121 results for Systemic changes and turbulences
Abstract:
Adding to the ongoing debate regarding vegetation recolonisation in Europe (more particularly its timing) and climate change since the Lateglacial, this study investigates a long sediment core (LL081) from Lake Ledro (652 m a.s.l., southern Alps, Italy). Environmental changes were reconstructed using multiproxy analyses (pollen-based vegetation and climate reconstruction, lake levels, magnetic susceptibility and X-ray fluorescence (XRF) measurements) that record climate and land-use changes during the Lateglacial and early-middle Holocene. The well-dated and high-resolution pollen record of Lake Ledro is compared with vegetation records from the southern and northern Alps to trace the history of tree species distribution. An altitude-dependent progressive time delay of the first continuous occurrence of Abies (fir) and of the Larix (larch) development has been observed since the Lateglacial in the southern Alps. This pattern suggests that the mid-altitude Lake Ledro area was not a refuge and that trees originated from lowlands or hilly areas (e.g. the Euganean Hills) in northern Italy. Preboreal oscillations (ca. 11 000 cal BP), Boreal oscillations (ca. 10 200 and 9300 cal BP) and the 8.2 kyr cold event suggest a centennial-scale climate forcing in the studied area. Picea (spruce) expansion occurred preferentially around 10 200 and 8200 cal BP in the south-eastern Alps and therefore reflects the long-lasting cumulative effects of the successive Boreal oscillations and the 8.2 kyr cold event. The extension of Abies is contemporaneous with the 8.2 kyr event, but its development in the southern Alps benefited from the wettest interval, 8200-7300 cal BP, evidenced in high lake levels, flood activity and pollen-based climate reconstructions. Since ca. 7500 cal BP, a weak signal of pollen-based anthropogenic indicators suggests limited human impact. The period between ca. 5700 and ca. 4100 cal BP is considered a transition to colder and wetter conditions (particularly during summers) that favoured the development of a dense beech (Fagus) forest, which in turn caused a distinctive yew (Taxus) decline. We conclude that climate was the dominant factor controlling vegetation changes and erosion processes during the early and middle Holocene (up to ca. 4100 cal BP).
Modelling the effects of land use and climate changes on hydrology in the Ursern Valley, Switzerland
Abstract:
While many studies have been conducted in mountainous catchments to examine the impact of climate change on hydrology, the interactions between climate change and land use components have largely unknown impacts on hydrology in alpine regions. They need to be given special attention in order to devise possible strategies concerning general development in these regions. Thus, the main aim was to examine the impact of land use change (i.e. bushland expansion) and climate change (i.e. temperature increase) on hydrology by model simulations. For this purpose, the physically based WaSiM-ETH model was applied to the catchment of the Ursern Valley in the central Alps (191 km2) over the period 1983−2005. Modelling results showed that the reduction of the mean monthly discharge during the summer period is due primarily to the shift of snowmelt discharge to earlier in the year and secondarily to the reduction in glacier surface area together with its temporal retreat, rather than to the increase in evapotranspiration due to the expansion of "green alder" at the expense of grassland. The significant decrease in summer discharge during July, August and September shows a change in regime from glacio-nival to nivo-glacial. These changes are confirmed by the modelling results, which attest to a temporal shift in snowmelt and glacier discharge towards earlier in the year: March, April and May for snowmelt and May and June for glacier discharge. It is expected that the yearly total discharge will be reduced by 0.6% in the near future due to land use changes alone, whereas it will be reduced by about 5% if climate change is also taken into account. Copyright © 2013 John Wiley & Sons, Ltd.
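To make the scenario percentages quoted above concrete, here is a minimal sketch of how a reduction in mean annual discharge between simulation runs can be computed; the discharge series are hypothetical placeholders chosen only to roughly reproduce the quoted 0.6% and 5% figures, not WaSiM-ETH output.

# Hypothetical annual discharge series (mm/yr) from three model runs; illustrative only.
baseline = [820.0, 790.0, 805.0, 840.0, 815.0]
land_use_only = [815.0, 785.0, 800.0, 835.0, 810.0]
land_use_and_climate = [780.0, 750.0, 765.0, 795.0, 775.0]

def mean(series):
    return sum(series) / len(series)

def pct_reduction(scenario, reference):
    # Percentage decrease of mean annual discharge relative to the reference run.
    return 100.0 * (mean(reference) - mean(scenario)) / mean(reference)

print(f"Land use only:        {pct_reduction(land_use_only, baseline):.1f}% reduction")
print(f"Land use and climate: {pct_reduction(land_use_and_climate, baseline):.1f}% reduction")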
Developmental changes in sleep biology and potential effects on adolescent behavior and caffeine use
Abstract:
Adolescent development includes changes in the biological regulatory processes for the timing of sleep. Circadian rhythm changes and changes to the sleep-pressure system (sleep homeostasis) during adolescence both favor later timing of sleep. These changes, combined with prevailing social pressures, are responsible for most teens sleeping too late and too little; those who sleep least report consuming more caffeine. Although direct research findings are scarce, the likelihood of use and abuse of caffeine-laden products grows across the adolescent years due, in part, to excessive sleepiness.
Abstract:
PURPOSE To evaluate and compare crestal bone level changes and peri-implant status of implant-supported reconstructions in edentulous and partially dentate patients after a minimum of 5 years of loading. MATERIALS AND METHODS All patients who received a self-tapping implant with a microstructured surface during the years 2003 and 2004 at the Department of Prosthodontics, University of Bern, were included in this study. The implant restorations comprised fixed and removable prostheses for partially and completely edentulous patients. Radiographs were taken immediately after surgery, at impression making, and 1 and 5 years after loading. Crestal bone level (BIC) was measured from the implant shoulder to the first bone contact, and changes were calculated over time (ΔBIC). The associations between pocket depth, bleeding on probing (BOP), and ΔBIC were assessed. RESULTS Sixty-one implants were placed in 20 patients (mean age, 62 ± 7 years). At the 5-year follow-up, 19 patients with 58 implants were available. Implant survival was 98.4% (one early failure; one patient died). The average ΔBIC between surgery and 5-year follow-up was 1.5 ± 0.9 mm and 1.1 ± 0.6 mm for edentulous and partially dentate patients, respectively. Most bone resorption (50%, 0.7 mm) occurred during the first 3 months (osseointegration) and within the first year of loading (21%, 0.3 mm). Mean annual bone loss during the 5 years of loading was < 0.12 mm. Mean pocket depth was 2.6 ± 0.7 mm. Seventeen percent of the implant sites displayed BOP; the frequency was significantly higher in women. None of the variables were significantly associated with crestal bone loss. CONCLUSION Crestal bone loss after 5 years was within the normal range, without a significant difference between edentulous and partially dentate patients. In the short term, this implant system can be used successfully for various prosthetic indications.
Abstract:
Clinically, the displacement of intravertebral fat into the circulation during vertebroplasty is reported to lead to problems in elderly patients and can represent a serious complication, especially when multiple levels have to be treated. An in vitro study has shown the feasibility of removing intravertebral fat by pulsed jet-lavage prior to vertebroplasty, potentially reducing the embolization of bone marrow fat from the vertebral bodies and alleviating the cardiovascular changes elicited by pulmonary fat embolism. In this in vivo study, percutaneous vertebroplasty using polymethylmethacrylate (PMMA) was performed in three lumbar vertebrae of 11 sheep. In six sheep (lavage group), pulsed jet-lavage was performed prior to injection of PMMA compared to the control group of five sheep receiving only PMMA vertebroplasty. Invasive recording of blood pressures was performed continuously until 60 min after the last injection. Cardiac output and arterial blood gas parameters were measured at selected time points. Post mortem, the injected cement volume was measured using CT and lung biopsies were processed for assessment of intravascular fat. Pulsed jet-lavage was feasible in the in vivo setting. In the control group, the injection of PMMA resulted in pulmonary fat embolism and a sudden and significant increase in mean pulmonary arterial pressure. Pulsed jet-lavage prevented any cardiovascular changes and significantly reduced the severity of bone marrow fat embolization. Even though significantly more cement had been injected into the lavaged vertebral bodies, significantly fewer intravascular fat emboli were identified in the lung tissue. Pulsed jet-lavage prevented the cardiovascular complications after PMMA vertebroplasty in sheep and alleviated the severity of pulmonary fat embolism.
Abstract:
In Alagille syndrome, routine follow-up imaging of the liver plays an important role in detecting early parenchymal changes and in evaluating portal hypertension. Modern contrast-enhanced imaging methods not only allow early detection of focal liver lesions, but also enable further characterization of their nature and guide biopsy procedures. We present the US and MR imaging findings of hepatocellular carcinoma and a regenerating nodule in a 3-year-old child with Alagille syndrome.
Abstract:
Diagnostic and therapeutic approaches to trauma patients are subject to wide variation, depending on experience, equipment and differing therapeutic doctrines. The ability to compare trauma centres using a standardised trauma register helps to reveal unresolved systemic issues and simplifies quality management in an Emergency Department (ED).
Abstract:
Embryonic head development, including the formation of dental structures, is a complex and delicate process guided by specific genetic programs. Genetic changes and environmental factors can disturb the execution of these programs and result in abnormalities in orofacial and dental structures. Orofacial clefts and hypodontia/oligodontia are examples of such abnormalities frequently seen in dental clinics. Insight into the mechanisms and genes involved in the formation of orofacial and dental structures has been gradually gained through genetic analysis of families and the use of experimental vertebrate models such as the mouse and chick. The development of novel clinical therapies for orofacial and dental pathological conditions depends very much on detailed knowledge of the molecular and cellular processes involved in head formation.
Abstract:
Data on rainfall, runoff and sediment loss from different land use types have been collected by the Soil Conservation Research Programme in seven small catchments (73-673 hectares) throughout the Ethiopian Highlands since the early 1980s. Monitoring was carried out on a storm-to-storm basis for extended periods of 10-20 years, and the data are analysed here to assess long-term effects of changes. Soil and water conservation technologies were introduced in the early years in the catchments in view of their capacity to reduce runoff and sediment yield. Results indicate that rainfall did not substantially change over the observation periods. Land use changes and land degradation, however, altered runoff, as shown by the data from small test plots (30 m2), which were not altered by conservation measures during the monitoring periods. Sediment delivery from the catchments may have decreased due to soil and water conservation, while runoff rates did not change significantly. Extrapolation of the results in the highlands, however, showed that expansion of cultivated and grazing land induced by population growth may have increased the overall surface runoff. Watershed management in the catchments, finally, had beneficial effects on ecosystem services by reducing soil erosion, restoring soil fertility, enhancing agricultural production, and maintaining overall runoff to the benefit of lowland areas and neighbouring countries.
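As a simple illustration of the storm-by-storm bookkeeping behind such plot data, the sketch below derives a runoff coefficient and an area-normalised sediment yield from a few hypothetical storm records; the field names and values are placeholders, not SCRP measurements.

# Hypothetical storm records for a small test plot; values are illustrative only.
storms = [
    {"rainfall_mm": 42.0, "runoff_mm": 9.5, "sediment_kg": 3.1},
    {"rainfall_mm": 18.0, "runoff_mm": 2.0, "sediment_kg": 0.4},
    {"rainfall_mm": 60.0, "runoff_mm": 21.0, "sediment_kg": 7.8},
]
plot_area_m2 = 30.0  # test plot size quoted in the abstract

total_rain = sum(s["rainfall_mm"] for s in storms)
total_runoff = sum(s["runoff_mm"] for s in storms)
total_sediment_kg = sum(s["sediment_kg"] for s in storms)

runoff_coefficient = total_runoff / total_rain                        # dimensionless fraction
sediment_yield_t_per_ha = (total_sediment_kg / 1000.0) / (plot_area_m2 / 10000.0)

print(f"Runoff coefficient over the recorded storms: {runoff_coefficient:.2f}")
print(f"Sediment yield: {sediment_yield_t_per_ha:.1f} t/ha")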
Abstract:
Sirtuins and hypoxia-inducible transcription factors (HIF) have well-established roles in regulating cellular responses to metabolic and oxidative stress. Recent reports have linked these two protein families by demonstrating that sirtuins can regulate the activity of HIF-1 and HIF-2. Here we investigated the role of SIRT1, a NAD+-dependent deacetylase, in the regulation of HIF-1 activity in hypoxic conditions. Our results show that in hepatocellular carcinoma (HCC) cell lines, hypoxia did not alter SIRT1 mRNA or protein expression, whereas it predictably led to the accumulation of HIF-1α and the up-regulation of its target genes. In hypoxic models in vitro and in in vivo models of systemic hypoxia and xenograft tumor growth, knockdown of SIRT1 protein with shRNA or inhibition of its activity with small molecule inhibitors impaired the accumulation of HIF-1α protein and the transcriptional increase of its target genes. In addition, endogenous SIRT1 and HIF-1α proteins co-immunoprecipitated and loss of SIRT1 activity led to a hyperacetylation of HIF-1α. Taken together, our data suggest that HIF-1α and SIRT1 proteins interact in HCC cells and that HIF-1α is a target of SIRT1 deacetylase activity. Moreover, SIRT1 is necessary for HIF-1α protein accumulation and activation of HIF-1 target genes under hypoxic conditions.
Abstract:
The concept of chronic critical limb ischaemia (CLI) emerged late in the history of peripheral arterial occlusive disease (PAOD). The historical background and changing definitions of CLI over the last decades are important to know in order to understand why epidemiologic data are so difficult to compare between articles and over time. The prevalence of CLI is probably very high and largely underestimated, and significant differences exist between population studies and clinical series. The extremely high costs associated with management of these patients make CLI a real public health issue for the future. In the era of emerging vascular surgery in the 1950s, the initial classification of PAOD by Fontaine, with stages III and IV corresponding to CLI, was based only on clinical symptoms. Later, with increasing access to non-invasive haemodynamic measurements (ankle pressure, toe pressure), the need to prove a causal relationship between PAOD and clinical findings suggestive of CLI became a real concern, and the Rutherford classification published in 1986 included objective haemodynamic criteria. The first consensus document on CLI was published in 1991 and included clinical criteria associated with ankle pressure, toe pressure and transcutaneous oxygen pressure (TcPO2) cut-off levels (<50 mmHg, <30 mmHg and <10 mmHg, respectively). This rigorous definition reflects an arterial insufficiency so severe as to cause microcirculatory changes and compromise tissue integrity, with a high rate of major amputation and mortality. The TASC I consensus document published in 2000 used less severe pressure cut-offs (≤50-70 mmHg, ≤30-50 mmHg and ≤30-50 mmHg, respectively). The thresholds for toe pressure and especially TcPO2 (which would also be included in the TASC II consensus document) are, however, just below the lower limit of normality. It is therefore easy to infer that patients qualifying as CLI based on TASC criteria can suffer from far less severe disease than those qualifying as CLI under the initial 1991 consensus document. Furthermore, the inclusion criteria of many recent interventional studies have shifted even further from these efforts at definition standardisation with objective criteria, by including patients as CLI based merely on the Fontaine classification (stages III and IV) without haemodynamic criteria. The differences in the natural history of patients with CLI, including prognosis of the limb and the patient, are thus difficult to compare between studies in this context. Overall, CLI as defined by clinical and haemodynamic criteria remains a severe condition with poor prognosis, high medical costs and a major impact in terms of public health and patients' loss of functional capacity. Major progress in best medical therapy of arterial disease and in revascularisation procedures will certainly improve the outcome of CLI patients. In the future, an effort to apply a standardised definition with clinical and objective haemodynamic criteria will be needed to better demonstrate and compare advances in the management of these patients.
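To illustrate how the 1991 consensus thresholds quoted above can be turned into a simple decision rule, here is a minimal sketch; the function name and inputs are hypothetical, the clinical criteria (rest pain, tissue loss) are collapsed into a single flag for brevity, and treating one confirmatory haemodynamic measurement as sufficient is an assumption rather than a statement of the consensus document.

from typing import Optional

def meets_1991_consensus_cli(rest_pain_or_tissue_loss: bool,
                             ankle_pressure_mmHg: Optional[float] = None,
                             toe_pressure_mmHg: Optional[float] = None,
                             tcpo2_mmHg: Optional[float] = None) -> bool:
    """Hypothetical screen against the 1991 consensus haemodynamic cut-offs
    (ankle pressure < 50 mmHg, toe pressure < 30 mmHg, TcPO2 < 10 mmHg)."""
    if not rest_pain_or_tissue_loss:
        return False
    criteria = [
        ankle_pressure_mmHg is not None and ankle_pressure_mmHg < 50,
        toe_pressure_mmHg is not None and toe_pressure_mmHg < 30,
        tcpo2_mmHg is not None and tcpo2_mmHg < 10,
    ]
    # Assumption: one confirmatory haemodynamic measurement is treated as sufficient.
    return any(criteria)

# Example: ischaemic rest pain with an ankle pressure of 40 mmHg.
print(meets_1991_consensus_cli(True, ankle_pressure_mmHg=40))  # True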
Abstract:
Impacts of low-latitude, explosive volcanic eruptions on climate and the carbon cycle are quantified by forcing a comprehensive, fully coupled carbon cycle-climate model with pulse-like stratospheric aerosol optical depth changes. The model represents the radiative and dynamical response of the climate system to volcanic eruptions and simulates a decrease of global and regional atmospheric surface temperature, regionally distinct changes in precipitation, a positive phase of the North Atlantic Oscillation, and a decrease in atmospheric CO2 after volcanic eruptions. The volcanic-induced cooling reduces overturning rates in tropical soils, which dominates over reduced litter input due to soil moisture decrease, resulting in higher land carbon inventories for several decades. The perturbation in the ocean carbon inventory changes sign from an initial weak carbon sink to a carbon source. Positive carbon and negative temperature anomalies in subsurface waters last up to several decades. The multi-decadal decrease in atmospheric CO2 yields a small additional radiative forcing that amplifies the cooling and perturbs the Earth System on longer time scales than the atmospheric residence time of volcanic aerosols. In addition, century-scale global warming simulations with and without volcanic eruptions over the historical period show that the ocean integrates the volcanic radiative cooling, with a response evident in different physical and biogeochemical parameters such as steric sea level and dissolved oxygen. Results from a suite of sensitivity simulations with different magnitudes of stratospheric aerosol optical depth changes and from global warming simulations show that the carbon cycle-climate sensitivity γ, expressed as the change in atmospheric CO2 per unit change in global mean surface temperature, depends on the magnitude and temporal evolution of the perturbation and on the time scale of interest. On decadal time scales, modeled γ is several times larger for a Pinatubo-like eruption than for the industrial period and for a high-emission, 21st century scenario.
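Written out, the sensitivity defined above is simply the ratio of the two anomalies, γ = ΔCO2 / ΔT, where ΔCO2 is the change in atmospheric CO2 (commonly expressed in ppm) and ΔT the change in global mean surface temperature (in K) over the time scale of interest; the units are a conventional assumption here, as the abstract does not state them.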
Abstract:
Data on antimicrobial use play a key role in the development of policies for the containment of antimicrobial resistance. On-farm data could provide a detailed overview of the antimicrobial use, but technical and methodological aspects of data collection and interpretation, as well as data quality need to be further assessed. The aims of this study were (1) to quantify antimicrobial use in the study population using different units of measurement and contrast the results obtained, (2) to evaluate data quality of farm records on antimicrobial use, and (3) to compare data quality of different recording systems. During 1 year, data on antimicrobial use were collected from 97 dairy farms. Antimicrobial consumption was quantified using: (1) the incidence density of antimicrobial treatments; (2) the weight of active substance; (3) the used daily dose and (4) the used course dose for antimicrobials for intestinal, intrauterine and systemic use; and (5) the used unit dose, for antimicrobials for intramammary use. Data quality was evaluated by describing completeness and accuracy of the recorded information, and by comparing farmers' and veterinarians' records. Relative consumption of antimicrobials depended on the unit of measurement: used doses reflected the treatment intensity better than weight of active substance. The use of antimicrobials classified as high priority was low, although under- and overdosing were frequently observed. Electronic recording systems allowed better traceability of the animals treated. Recording drug name or dosage often resulted in incomplete or inaccurate information. Veterinarians tended to record more drugs than farmers. The integration of veterinarian and farm data would improve data quality.
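As an illustration of the first unit of measurement listed above, the sketch below computes a treatment incidence density from aggregate farm figures; the normalisation to 1,000 animal-days at risk and the example numbers are assumptions for illustration, not the study's actual convention or data.

def treatment_incidence_density(n_treatments: int,
                                n_animals: float,
                                days_at_risk: int,
                                per: float = 1000.0) -> float:
    """Antimicrobial treatments per `per` animal-days at risk (assumed normalisation)."""
    animal_days = n_animals * days_at_risk
    return n_treatments * per / animal_days

# Hypothetical example: 85 recorded treatments on a 60-cow dairy farm over one year.
print(f"{treatment_incidence_density(85, 60, 365):.2f} treatments per 1,000 animal-days")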
Abstract:
Resting heart rate is a promising modifiable cardiovascular risk marker in older adults, but the mechanisms linking heart rate to cardiovascular disease are not fully understood. We aimed to assess the association between resting heart rate and incident heart failure (HF) and cardiovascular mortality, and to examine whether these associations might be attributable to systemic inflammation and endothelial dysfunction.