947 results for Elasticity of output with respect to factors


Relevance: 100.00%

Abstract:

Osteoporosis (OP) is a systemic skeletal disease characterized by low bone mineral density (BMD) and micro-architectural (MA) deterioration. Clinical risk factors (CRF) are often used as an approximation of MA. MA can, however, be evaluated in daily practice with the trabecular bone score (TBS), which is simple to obtain by reanalyzing a lumbar DXA scan. TBS has proven diagnostic and prognostic value, partially independent of CRF and BMD. The aim of the OsteoLaus cohort is to combine, in daily practice, the CRF and the information given by DXA (BMD, TBS and vertebral fracture assessment (VFA)) to better identify women at high fracture risk. The OsteoLaus cohort (1400 women aged 50 to 80 years living in Lausanne, Switzerland) started in 2010. It is derived from the CoLaus cohort, started in Lausanne in 2003, whose main goal is to obtain information on the epidemiology and genetic determinants of cardiovascular risk in 6700 men and women. CRF for OP, bone ultrasound of the heel, lumbar spine and hip BMD, VFA by DXA and MA evaluation by TBS are recorded in OsteoLaus. Preliminary results are reported here. We included 631 women: mean age 67.4 ± 6.7 years, BMI 26.1 ± 4.6, mean lumbar spine BMD 0.943 ± 0.168 (T-score −1.4 SD), and TBS 1.271 ± 0.103. As expected, the correlation between BMD and site-matched TBS is low (r2 = 0.16). The prevalence of vertebral fracture (VFx) grade 2/3, major OP fracture (Fx) and all OP Fx is 8.4%, 17.0% and 26.0%, respectively. Age- and BMI-adjusted ORs (per SD decrease) for these fracture categories are 1.8 (1.2-2.5), 1.6 (1.2-2.1) and 1.3 (1.1-1.6) for BMD, and 2.0 (1.4-3.0), 1.9 (1.4-2.5) and 1.4 (1.1-1.7) for TBS, respectively. Only 32 to 37% of women with OP Fx have a BMD < −2.5 SD or a TBS < 1.200 when each criterion is used alone; combining the two (BMD < −2.5 SD or TBS < 1.200) identifies 54 to 60% of women with an osteoporotic Fx. As in the already published studies, these preliminary results confirm the partial independence between BMD and TBS. More importantly, adding TBS to BMD significantly increases the identification of women with prevalent OP Fx who would have been misclassified by BMD alone. For the first time we can obtain complementary information on fracture (VFA), density (BMD), and micro- and macro-architecture (TBS and HSA) from a single simple, cheap device with low ionizing radiation: DXA. Such complementary information is very useful for the patient in daily practice and will likely also have an impact on cost-effectiveness analyses.
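The combined screening rule described above (flag a woman when either BMD T-score < −2.5 SD or TBS < 1.200) is simple enough to state in code. A minimal sketch, using the abstract's thresholds but entirely invented patient records:

```python
# Combined densitometric screening rule from the abstract:
# high risk if BMD T-score < -2.5 SD OR TBS < 1.200.
# Thresholds come from the text; the patients below are hypothetical.

def high_risk(t_score: float, tbs: float,
              t_threshold: float = -2.5, tbs_threshold: float = 1.200) -> bool:
    """Return True if either densitometric criterion flags the patient."""
    return t_score < t_threshold or tbs < tbs_threshold

patients = [
    {"id": 1, "t_score": -2.8, "tbs": 1.30},  # flagged by BMD alone
    {"id": 2, "t_score": -1.9, "tbs": 1.15},  # missed by BMD, caught by TBS
    {"id": 3, "t_score": -1.0, "tbs": 1.28},  # flagged by neither
]

for p in patients:
    label = "high risk" if high_risk(p["t_score"], p["tbs"]) else "low risk"
    print(p["id"], label)
```

Patient 2 illustrates the abstract's central point: a woman misclassified by BMD alone is still identified once TBS is added.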

Relevance: 100.00%

Abstract:

In the outpatient setting, long-term management of cardiovascular risk factors is essential to prevent recurrent cardiovascular disease. Recent studies have shown an additional benefit of beginning cardiovascular secondary prevention during the hospital stay. Early in-hospital initiation of medications of proven benefit, such as aspirin or lipid-lowering drugs, together with therapeutic lifestyle-change counseling, improves patients' long-term outcomes, provided there is continuity of care in the outpatient setting. A recent hospitalization can be a teachable moment, when patients are more likely to modify their health behaviors. Continuity of care between in-hospital medicine and the outpatient setting helps patients in the long-term management of their cardiovascular disease.

Relevance: 100.00%

Abstract:

When a rubber hand is placed on a table top in a plausible position, as if part of a person's body, and is stroked synchronously with the person's corresponding hidden real hand, an illusion of ownership over the rubber hand can occur (Botvinick and Cohen 1998). A similar result has been found with a virtual hand portrayed in a virtual environment, the virtual hand illusion (Slater et al. 2008). The conditions under which these illusions occur have been the subject of considerable study. Here we exploited the flexibility of virtual reality to examine four contributory factors: visuo-tactile synchrony while stroking the virtual and the real arms, body continuity, alignment between the real and virtual arms, and the distance between them. We carried out three experiments on a total of 32 participants in which these factors were varied. The results show that the subjective illusion of ownership over the virtual arm, and the time needed to evoke it, are highly dependent on synchronous visuo-tactile stimulation and on connectivity of the virtual arm with the rest of the virtual body. The alignment between the real and virtual arms and the distance between them were less important. Proprioceptive drift was not a sensitive measure of the illusion, being related only to the distance between the real and virtual arms.

Relevance: 100.00%

Abstract:

Background: Analysing observed differences in incidence or mortality from a particular disease between two situations (such as time points, geographical areas, gender or other social characteristics) can be useful for both scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the contribution of demographic factors to these observed differences, in order to isolate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure and population size), is a common choice in practice. Results: A web-based application called RiskDiff (available at http://rht.iconcologia.net/riskdiff.htm) has been implemented to perform this kind of statistical analysis, providing text and graphical summaries. Code for the implemented functions in R is also provided. An application to cancer mortality data from Catalonia is used for illustration. Conclusions: Combining epidemiological and demographic factors is crucial when analysing incidence or mortality from a disease, especially if the population pyramids differ substantially. The tool may serve to promote and disseminate the use of this method, supporting epidemiological interpretation and decision making in public health.
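The abstract names the Bashir and Estève three-way split but does not reproduce it. Below is a minimal Python sketch of one sequential version of such a decomposition; the age-specific rates and population counts are invented, and the published method and RiskDiff's R code remain the authoritative references:

```python
import numpy as np

# Hypothetical age-specific mortality rates (per person) and population
# counts for two situations, in three age bands.
rate1 = np.array([0.0002, 0.0010, 0.008])   # situation 1
rate2 = np.array([0.0003, 0.0012, 0.007])   # situation 2
pop1 = np.array([50_000, 30_000, 10_000])
pop2 = np.array([55_000, 36_000, 16_000])

deaths1 = (rate1 * pop1).sum()
deaths2 = (rate2 * pop2).sum()

# Step 1: rescale population 1 to the total size of population 2,
# keeping its age structure -> isolates the population-size effect.
scaled = (rate1 * pop1 * pop2.sum() / pop1.sum()).sum()
size_effect = scaled - deaths1

# Step 2: apply situation-1 rates to population 2 -> structure effect.
restructured = (rate1 * pop2).sum()
structure_effect = restructured - scaled

# Step 3: whatever remains is attributable to changed risk.
risk_effect = deaths2 - restructured

# The three components sum exactly to the observed difference.
assert np.isclose(size_effect + structure_effect + risk_effect,
                  deaths2 - deaths1)
print(f"size={size_effect:.1f} structure={structure_effect:.1f} "
      f"risk={risk_effect:.1f}")
```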

Relevance: 100.00%

Abstract:

Objective: Despite progress during recent decades, the long-term outcome of patients with pancreatic cancer remains dismal. Since positive resection margins and metastatic lymph nodes are known risk factors for early tumor recurrence, patients at risk should be identified and could potentially benefit from preoperative radio-chemotherapy. This study aimed to assess whether the presence of lymph node metastasis can be used to predict positive resection margins in patients with pancreatic cancer. Methods: A series of 146 patients (82 male, 64 female, median age 68 years) underwent pancreatic head resection for various malignant diseases (pancreatic ductal adenocarcinoma, biliary cancer, periampullary cancer) at our institution from 2000 to 2011. Patients were identified from our prospective database, which collects more than 60 single items on all patients undergoing pancreatic resection. Lymph node metastases and positive resection margins were all confirmed by histological evaluation. Positive predictive value (PPV), negative predictive value (NPV), sensitivity and specificity were calculated to assess the predictive value of metastatic lymph nodes regarding tumor-free (R0) and tumor-involved (R1) resection margins. Results: There were 110 specimens (76%) with tumor-positive lymph nodes and 36 specimens with tumor-negative lymph nodes. Resection margins were positive in 47 specimens (32%) and negative in 99 specimens. The sensitivity of tumor-positive lymph nodes for detecting positive resection margins was 96%, and the NPV was 94%. In contrast, the specificity was 34% and the PPV was 41%. Conclusion: Patients with resectable pancreatic cancer who have no lymph node metastasis are at very low risk of positive resection margins (2 of 36 patients, NPV 94%). In contrast, more than one third of patients with metastatic lymph nodes are at increased risk of an incomplete tumor resection (sensitivity 96%). If lymph node metastases are strongly suspected at preoperative staging, a neoadjuvant treatment strategy should be considered to increase the R0 resection rate.
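The reported metrics can be re-derived from the abstract's own marginal counts. A short sketch, with the four cells of the 2x2 table inferred from those margins (45 node-positive/R1, 2 node-negative/R1, 65 node-positive/R0, 34 node-negative/R0):

```python
# 2x2 table reconstructed from the abstract's margins (inferred counts).
# "Positive test" = metastatic lymph nodes; "condition" = R1 margins.
tp, fn, fp, tn = 45, 2, 65, 34

sensitivity = tp / (tp + fn)   # 45/47  ~ 0.96, as reported
specificity = tn / (tn + fp)   # 34/99  ~ 0.34
ppv = tp / (tp + fp)           # 45/110 ~ 0.41
npv = tn / (tn + fn)           # 34/36  ~ 0.94

print(f"sens={sensitivity:.2f} spec={specificity:.2f} "
      f"PPV={ppv:.2f} NPV={npv:.2f}")
```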

Relevance: 100.00%

Abstract:

Quinupristin-dalfopristin (Q-D) is an injectable streptogramin active against most gram-positive pathogens, including methicillin-resistant Staphylococcus aureus (MRSA). In experimental endocarditis, however, Q-D was less efficacious against MRSA isolates constitutively resistant to macrolide-lincosamide-streptogramin B (C-MLS(B)) than against MLS(B)-susceptible isolates. To circumvent this problem, we used the checkerboard method to screen for drug combinations that would increase the efficacy of Q-D against such bacteria. Beta-lactams consistently exhibited additive or synergistic activity with Q-D; glycopeptides, quinolones, and aminoglycosides were indifferent; no drugs were antagonistic. The positive Q-D-beta-lactam interaction was independent of MLS(B) or beta-lactam resistance. Moreover, adding Q-D at one-fourth the MIC to flucloxacillin-containing plates decreased the flucloxacillin MIC for MRSA from 500-1,000 mg/liter to 30-60 mg/liter. Yet Q-D-beta-lactam combinations were not synergistic in bactericidal tests. Rats with aortic vegetations were infected with two C-MLS(B)-resistant MRSA isolates (AW7 and P8) and were treated for 3 or 5 days with drug dosages simulating the following treatments in humans: (i) Q-D at 7 mg/kg twice a day (b.i.d.) (a relatively low dosage purposely used to help detect positive drug interactions); (ii) cefamandole at constant serum levels of 30 mg/liter; (iii) cefepime at 2 g b.i.d.; (iv) Q-D combined with either cefamandole or cefepime. Each drug used alone resulted in treatment failure. In contrast, Q-D plus either cefamandole or cefepime significantly decreased valve infection compared to both untreated controls and monotherapy (P < 0.05). Importantly, Q-D prevented the growth of highly beta-lactam-resistant MRSA in vivo. The mechanism of this beneficial drug interaction is unknown. However, Q-D-beta-lactam combinations might be useful for the treatment of complicated infections caused by multiple organisms, including MRSA.
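Checkerboard screens such as this one are conventionally summarized with the fractional inhibitory concentration (FIC) index. A minimal sketch of that calculation; the MIC values and the interpretation cut-offs below are illustrative assumptions, not figures from the study:

```python
# Fractional inhibitory concentration (FIC) index, the usual readout of a
# checkerboard assay. All MIC values here are hypothetical.

def fic_index(mic_a_alone: float, mic_a_combo: float,
              mic_b_alone: float, mic_b_combo: float) -> float:
    """Sum of each drug's MIC in combination over its MIC alone."""
    return mic_a_combo / mic_a_alone + mic_b_combo / mic_b_alone

def interpret(fic: float) -> str:
    # Commonly used cut-offs; exact boundaries vary between authors.
    if fic <= 0.5:
        return "synergistic"
    if fic <= 4.0:
        return "additive/indifferent"
    return "antagonistic"

fic = fic_index(mic_a_alone=1.0, mic_a_combo=0.25,
                mic_b_alone=500.0, mic_b_combo=60.0)
print(f"FIC index = {fic:.2f} -> {interpret(fic)}")
```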

Relevance: 100.00%

Abstract:

OBJECTIVES: Resuscitation in severe head injury may be detrimental when given with hypotonic fluids. We evaluated the effects of lactated Ringer's solution (sodium 131 mmol/L, 277 mOsm/L) compared with hypertonic saline (sodium 268 mmol/L, 598 mOsm/L) in severely head-injured children over the first 3 days after injury. DESIGN: An open, randomized, prospective study. SETTING: A 16-bed pediatric intensive care unit (ICU) (level III) at a university children's hospital. PATIENTS: A total of 35 consecutive children with head injury. INTERVENTIONS: Thirty-two children with Glasgow Coma Scores of <8 were randomly assigned to receive either lactated Ringer's solution (group 1) or hypertonic saline (group 2). Routine care was standardized and included the following: head positioning at 30 degrees; normothermia (96.8 to 98.6 degrees F [36 to 37 degrees C]); analgesia and sedation with morphine (10 to 30 microg/kg/hr), midazolam (0.2 to 0.3 mg/kg/hr), and phenobarbital; volume-controlled ventilation (PaCO2 of 26.3 to 30 torr [3.5 to 4 kPa]); and optimal oxygenation (PaO2 of 90 to 105 torr [12 to 14 kPa], oxygen saturation of >92%, and hematocrit of >0.30). MEASUREMENTS AND MAIN RESULTS: Mean arterial pressure and intracranial pressure (ICP) were monitored continuously and documented hourly and at every intervention. The means of every 4-hr period were calculated, and serum sodium concentrations were measured at the same time. An ICP above 15 mm Hg was treated with a predefined sequence of interventions, and complications were documented. There was no difference with respect to age, male/female ratio, or initial Glasgow Coma Score. In both groups, there was initially an inverse correlation between serum sodium concentration and ICP (group 1: r = -.13, r2 = .02, p < .03; group 2: r = -.29, r2 = .08, p < .001) that disappeared in group 1 and strengthened in group 2 over time (group 1: r = -.08, r2 = .01, NS; group 2: r = -.35, r2 = .12, p < .001). The correlation between serum sodium concentration and cerebral perfusion pressure (CPP) became significant in group 2 after 8 hrs of treatment (r = .2, r2 = .04, p = .002). Over time, ICP and CPP did not significantly differ between the groups. However, to keep ICP at <15 mm Hg, group 2 patients required significantly fewer interventions (p < .02). Group 1 patients received less sodium (8.0 +/- 4.5 vs. 11.5 +/- 5.0 mmol/kg/day, p = .05) and more fluid on day 1 (2850 +/- 1480 vs. 2180 +/- 770 mL/m2, p = .05). They also had a higher frequency of acute respiratory distress syndrome (four vs. 0 patients, p = .1) and of more than two complications (six vs. 1 patient, p = .09). Group 2 patients had significantly shorter ICU stays (11.6 +/- 6.1 vs. 8.0 +/- 2.4 days; p = .04) and shorter mechanical ventilation times (9.5 +/- 6.0 vs. 6.9 +/- 2.2 days; p = .1). The survival rate and duration of hospital stay were similar in both groups. CONCLUSIONS: Treatment of severe head injury with hypertonic saline is superior to treatment with lactated Ringer's solution. An increase in serum sodium concentration significantly correlates with lower ICP and higher CPP. Children treated with hypertonic saline require fewer interventions, have fewer complications, and stay in the ICU for a shorter time.

Relevance: 100.00%

Abstract:

BACKGROUND: Persons infected with human immunodeficiency virus (HIV) have increased rates of coronary artery disease (CAD). The relative contribution of genetic background, HIV-related factors, antiretroviral medications, and traditional risk factors to CAD has not been fully evaluated in the setting of HIV infection. METHODS: In the general population, 23 common single-nucleotide polymorphisms (SNPs) were shown to be associated with CAD through genome-wide association analysis. Using the Metabochip, we genotyped 1875 HIV-positive, white individuals enrolled in 24 HIV observational studies, including 571 participants with a first CAD event during the 9-year study period and 1304 controls matched on sex and cohort. RESULTS: A genetic risk score built from 23 CAD-associated SNPs contributed significantly to CAD (P = 2.9 × 10(-4)). In the final multivariable model, participants with an unfavorable genetic background (top genetic score quartile) had a CAD odds ratio (OR) of 1.47 (95% confidence interval [CI], 1.05-2.04). This effect was similar to hypertension (OR = 1.36; 95% CI, 1.06-1.73), hypercholesterolemia (OR = 1.51; 95% CI, 1.16-1.96), diabetes (OR = 1.66; 95% CI, 1.10-2.49), ≥ 1 year lopinavir exposure (OR = 1.36; 95% CI, 1.06-1.73), and current abacavir treatment (OR = 1.56; 95% CI, 1.17-2.07). The effect of the genetic risk score was additive to the effect of nongenetic CAD risk factors, and did not change after adjustment for family history of CAD. CONCLUSIONS: In the setting of HIV infection, the effect of an unfavorable genetic background was similar to traditional CAD risk factors and certain adverse antiretroviral exposures. Genetic testing may provide prognostic information complementary to family history of CAD.
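As a rough illustration of how a genetic risk score of this kind is assembled, here is a hedged sketch: risk-allele counts for 23 SNPs are combined into a weighted sum, and the top quartile is flagged as the "unfavorable genetic background". Both the genotypes and the per-SNP weights are simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical inputs: risk-allele counts (0/1/2) for 23 SNPs in 1875
# individuals, plus per-SNP log-odds weights. A real analysis would take
# genotypes from the array and weights from published GWAS estimates.
n_individuals, n_snps = 1875, 23
genotypes = rng.integers(0, 3, size=(n_individuals, n_snps))
weights = rng.normal(0.05, 0.02, size=n_snps)  # invented effect sizes

scores = genotypes @ weights                   # weighted allele-count score
quartile = np.digitize(scores, np.quantile(scores, [0.25, 0.5, 0.75]))

top = quartile == 3                            # top genetic-score quartile
print(f"{top.sum()} individuals flagged as unfavorable genetic background")
```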

Relevance: 100.00%

Abstract:

Background: Although published studies have found impaired Health-Related Quality of Life (HRQOL) in both psychiatric and substance-dependence disorders, very few have used HRQOL as an assessment measure in patients suffering from both comorbid conditions (Dual Diagnosis). The aim of the current study was to assess HRQOL in a group of patients with Dual Diagnosis compared with two non-comorbid groups, and to determine which clinical factors are related to HRQOL. Methods: A cross-sectional assessment of three experimental groups was carried out with the Short Form 36 Item Health Survey (SF-36). The sample consisted of a group with Dual Diagnosis (DD; N=35), one with Severe Mental Illness alone (SMI; N=35) and one with Substance Use Dependence alone (SUD; N=35); all participants were male. To assess the clinical correlates of SF-36 HRQOL, linear regression analyses were carried out. Results: The DD group showed lower scores on most subscales and in the mental health domain. The SUD group showed generally better HRQOL, while the SMI group held an intermediate position between the other two. Daily medication, suicide attempts and daily number of coffees were significantly associated with HRQOL, especially in the DD group. Conclusions: The DD group reported lower mental-health-related quality of life. Assessing HRQOL in dual-diagnosis patients makes it possible to identify specific needs in this population and may help to establish therapeutic goals to improve interventions.
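A minimal sketch of the kind of linear regression analysis the abstract describes, using statsmodels: the predictors mirror the correlates named above, but the data, variable names and exact model specification are all assumptions:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 105  # 3 groups x 35, as in the study; the values below are invented

df = pd.DataFrame({
    "sf36_mental": rng.normal(50, 10, n),      # SF-36 mental health score
    "daily_meds": rng.integers(0, 8, n),
    "suicide_attempts": rng.integers(0, 3, n),
    "daily_coffees": rng.integers(0, 10, n),
})

# Ordinary least squares regression of HRQOL on the clinical correlates.
model = smf.ols("sf36_mental ~ daily_meds + suicide_attempts + daily_coffees",
                data=df).fit()
print(model.summary())
```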

Relevance: 100.00%

Abstract:

BACKGROUND: In contrast to obesity, information on the health risks of underweight is sparse. We examined the long-term association between underweight and mortality, considering factors that may influence this relationship. METHODS: We included 31,578 individuals aged 25-74 years who participated in population-based health studies between 1977 and 1993 and were followed up for survival until 2008 by record linkage with the Swiss National Cohort (SNC). Body Mass Index (BMI) was calculated from measured (53% of the study population) or self-reported height and weight. Underweight was defined as BMI < 18.5 kg/m2. Cox regression models were used to determine mortality Hazard Ratios (HR) of underweight vs. normal weight (BMI 18.5 to <25.0 kg/m2). Covariates were study, sex, smoking, a healthy-eating proxy, sports frequency, and educational level. RESULTS: Underweight individuals represented 3.0% of the total study population (n = 945) and were mostly women (89.9%). Compared to normal weight, underweight was associated with increased all-cause mortality (HR: 1.37; 95% CI: 1.14-1.65). The increased risk was apparent in both sexes, regardless of smoking status, and was mainly driven by excess death from external causes (HR: 3.18; 1.96-5.17), but not from cancer, cardiovascular or respiratory diseases. The HR was 1.16 (0.88-1.53) in studies with measured BMI and 1.59 (1.24-2.05) with self-reported BMI. CONCLUSIONS: The increased mortality of underweight individuals was mainly due to an increased risk of death from external causes. Using self-reported BMI may lead to an overestimation of the mortality risk associated with underweight.
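A hedged sketch of a Cox proportional-hazards analysis in the spirit of the one described, using the lifelines library on simulated data; only a subset of the abstract's covariates is included, and none of the values are study data:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(2)
n = 1000  # toy cohort; everything below is simulated

df = pd.DataFrame({
    "followup_years": rng.exponential(20, n).clip(0.1, 31),
    "died": rng.integers(0, 2, n),          # event indicator
    "underweight": rng.integers(0, 2, n),   # BMI < 18.5 vs normal weight
    "smoker": rng.integers(0, 2, n),
    "female": rng.integers(0, 2, n),
})

# Cox regression: the coefficient on "underweight" corresponds to the
# hazard ratio reported in the abstract (study, diet proxy, sport and
# education are omitted here for brevity).
cph = CoxPHFitter()
cph.fit(df, duration_col="followup_years", event_col="died")
cph.print_summary()
```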

Relevance: 100.00%

Abstract:

Objective: Imipenem is a broad-spectrum antibiotic used to treat severe infections in critically ill patients. Imipenem pharmacokinetics (PK) was evaluated in a cohort of neonates treated in the Neonatal Intensive Care Unit of the Lausanne University Hospital. The objective of our study was to identify key demographic and clinical factors influencing imipenem exposure in this population. Method: PK data from neonates and infants with at least one imipenem concentration measured between 2002 and 2013 were analyzed using population PK modeling methods. Plasma concentrations were measured at the physician's discretion within the framework of a therapeutic drug monitoring (TDM) programme. The effects of demographic factors (sex, body weight, gestational age, postnatal age) and clinical factors (serum creatinine as a measure of kidney function; co-administration of furosemide, spironolactone, hydrochlorothiazide, vancomycin, metronidazole and erythromycin) on imipenem PK were explored. Model-based simulations were performed (with a median creatinine value of 46 μmol/l) to compare various dosing regimens with respect to their ability to maintain drug levels above predefined minimum inhibitory concentrations (MIC) for at least 40% of the dosing interval. Results: A total of 144 plasma samples were collected from 68 neonates and infants, predominantly preterm newborns, with a median gestational age of 27 weeks (24-41 weeks) and postnatal age of 21 days (2-153 days). A two-compartment model best characterized imipenem disposition. Actual body weight had the greatest impact on PK parameters, followed by gestational age, postnatal age and serum creatinine on clearance; these explained 19%, 9%, 14% and 9% of the interindividual variability in clearance, respectively. Model-based simulations suggested that 15 mg/kg every 12 hours maintains drug concentrations above a MIC of 2 mg/l for at least 40% of the dosing interval during the first days of life, whereas neonates older than 14 days of life required a dose of 20 mg/kg every 12 hours. Conclusion: Dosing strategies based on body weight and postnatal age are recommended for imipenem in all critically ill neonates and infants. Most current guidelines appear adequate for newborns, and TDM should be restricted to particular clinical situations.
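The exposure target named above (concentration above a MIC of 2 mg/l for at least 40% of the dosing interval) can be checked by simulation. A minimal sketch with a bi-exponential two-compartment bolus profile; the macro-constants are invented, so the printed fractions illustrate the computation, not the study's fitted model:

```python
import numpy as np

# Two-compartment IV-bolus disposition written as a bi-exponential.
# Intercepts in (mg/l) per (mg/kg) dosed, rate constants in 1/h;
# all four values are hypothetical placeholders.
A, alpha = 0.8, 1.0
B, beta = 0.4, 0.2
MIC = 2.0  # mg/l, the target from the abstract

def conc(t_hours, dose_mg_per_kg, tau):
    """Concentration under repeated bolus dosing, by superposition."""
    c = np.zeros_like(t_hours)
    for k in range(int(t_hours.max() // tau) + 1):
        dt = t_hours - k * tau
        mask = dt >= 0
        c[mask] += dose_mg_per_kg * (A * np.exp(-alpha * dt[mask])
                                     + B * np.exp(-beta * dt[mask]))
    return c

t = np.linspace(0, 48, 4801)
for dose, tau in [(15, 12), (20, 12)]:
    steady = t >= 24  # evaluate on the second day, near steady state
    frac = (conc(t, dose, tau)[steady] > MIC).mean()
    print(f"{dose} mg/kg q{tau}h: {frac:.0%} of interval above MIC")
```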

Relevance: 100.00%

Abstract:

BACKGROUND: Artemisinin-resistant Plasmodium falciparum has emerged in the Greater Mekong sub-region and poses a major global public health threat. Slow parasite clearance is a key clinical manifestation of reduced susceptibility to artemisinin. This study was designed to establish baseline values for clearance in patients from Sub-Saharan African countries with uncomplicated malaria treated with artemisinin-based combination therapies (ACTs). METHODS: A literature review in PubMed was conducted in March 2013 to identify all prospective clinical trials (uncontrolled trials, controlled trials and randomized controlled trials) of ACTs conducted in Sub-Saharan Africa between 1960 and 2012. Individual patient data from these studies were shared with the WorldWide Antimalarial Resistance Network (WWARN) and pooled using an a priori statistical analytical plan. Factors affecting early parasitological response were investigated using logistic regression with study sites fitted as a random effect. The risk of bias in included studies was evaluated based on study design, methodology and missing data. RESULTS: In total, 29,493 patients from 84 clinical trials were included in the analysis, treated with artemether-lumefantrine (n = 13,664), artesunate-amodiaquine (n = 11,337) and dihydroartemisinin-piperaquine (n = 4,492). The overall parasite clearance rate was rapid. The parasite positivity rate (PPR) decreased from 59.7% (95% CI: 54.5-64.9) on day 1 to 6.7% (95% CI: 4.8-8.7) on day 2 and 0.9% (95% CI: 0.5-1.2) on day 3. The 95th percentile of observed day 3 PPR was 5.3%. Independent risk factors predictive of day 3 positivity were: high baseline parasitaemia (adjusted odds ratio (AOR) = 1.16 (95% CI: 1.08-1.25) per 2-fold increase in parasite density, P < 0.001); fever (>37.5 °C) (AOR = 1.50 (95% CI: 1.06-2.13), P = 0.022); severe anaemia (AOR = 2.04 (95% CI: 1.21-3.44), P = 0.008); areas of low/moderate transmission (AOR = 2.71 (95% CI: 1.38-5.36), P = 0.004); and treatment with the loose formulation of artesunate-amodiaquine (AOR = 2.27 (95% CI: 1.14-4.51), P = 0.020, compared to dihydroartemisinin-piperaquine). CONCLUSIONS: The three ACTs assessed in this analysis continue to achieve rapid early parasitological clearance across the sites assessed in Sub-Saharan Africa. A threshold of 5% day 3 parasite positivity from a minimum sample size of 50 patients provides a more sensitive benchmark in Sub-Saharan Africa than the currently recommended threshold of 10% for triggering further investigation of artemisinin susceptibility.
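Applying the proposed benchmark at a surveillance site reduces to a binomial question: given at least 50 patients, is the observed day-3 positivity credibly above 5%? A sketch with invented counts, using scipy's exact binomial test:

```python
from scipy.stats import binomtest

# Day-3 parasite positivity at a hypothetical surveillance site,
# checked against the 5% benchmark suggested in the abstract.
n_patients = 50       # the minimum sample size named above
n_positive_day3 = 6   # invented site result

test = binomtest(n_positive_day3, n_patients, p=0.05, alternative="greater")
ci = test.proportion_ci(confidence_level=0.95)

print(f"observed day-3 PPR = {n_positive_day3 / n_patients:.1%}")
print(f"95% CI: {ci.low:.1%} - {ci.high:.1%}; "
      f"p = {test.pvalue:.3f} for PPR > 5%")
```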

Relevance: 100.00%

Abstract:

In nature, variation in, for example, herbivory, wind exposure, moisture and pollution often creates variation in physiological stress and plant productivity. This variation is seldom clear-cut, but rather results in clines of decreasing growth and productivity towards the high-stress end. These clines of unidirectionally changing stress are generally known as 'stress gradients'. Through its effect on plant performance, stress can fundamentally alter the ecological relationships between individuals, and through variation in survival and reproduction it also causes evolutionary change, i.e. local adaptation to stress and eventually speciation. Under certain conditions, local adaptations to environmental stress have been documented within just a few generations. In plant-plant interactions, the intensities of both negative interactions (competition) and positive ones (facilitation) are expected to vary along stress gradients. The stress-gradient hypothesis (SGH) suggests that net facilitation will be strongest under high biotic and abiotic stress, while a more recent 'humpback' model predicts the strongest net facilitation at intermediate stress levels. Plant interactions on stress gradients, however, are affected by a multitude of confounding factors, making studies of facilitation-related theories challenging. Among these factors are plant ontogeny, spatial scale, and local adaptation to stress. The last of these has very rarely been included in facilitation studies, despite the potential co-occurrence of local adaptations and changes in net facilitation along stress gradients. Current theory would predict both competitive effects and facilitative responses to be weakest in populations locally adapted to withstand high abiotic stress. This thesis is based on six experiments, conducted both in greenhouses and in the field in Russia, Norway and Finland, with mountain birch (Betula pubescens subsp. czerepanovii) as the model species. The aims were to study potential local adaptations along multiple stress gradients (both natural and anthropogenic), changes in plant-plant interactions under varying stress (as predicted by the SGH), potential mechanisms behind intraspecific facilitation, and factors confounding plant-plant facilitation, such as spatiotemporal, ontogenetic and genetic differences. I found rapid evolutionary adaptation (occurring within a time-span of 60 to 70 years) towards heavy-metal resistance around two copper-nickel smelters, a phenomenon that has resulted in a trade-off of decreased performance in pristine conditions. Heavy-metal-adapted individuals had lowered nickel uptake, indicating a possible mechanism behind the detected resistance. Seedlings adapted to heavy-metal toxicity were not co-resistant to other forms of abiotic stress, but showed co-resistance to biotic stress in that they were consumed to a lesser extent by insect herbivores. Conversely, populations from conditions of high natural stress (wind, drought etc.) showed no local adaptations, despite much longer evolutionary time scales. Due to decreasing emissions, I was unable to test the SGH in the pollution gradients. In natural stress gradients, however, plant performance was in accordance with the SGH, with the strongest host-seedling facilitation found at the high-stress sites in two different stress gradients.
Factors confounding this pattern included (1) plant size / ontogenetic status, with seedling-seedling interactions being competition-dominated and host-seedling interactions potentially switching towards competition as seedlings grow, and (2) spatial distance, with competition dominating at very short planting distances and facilitation being strongest at a distance of circa ¼ of benefactor height. I found no evidence for changes in facilitation with respect to the evolutionary histories of plant populations. Despite the support for the SGH, it may be that the 'humpback' model is more relevant when the main stressor is resource-related, whereas I studied the effects of 'non-resource' stressors (i.e. heavy-metal pollution and wind). The results have potential practical applications: the use of locally adapted seedlings and of plant facilitation may increase the success of future restoration efforts in industrial barrens as well as on other wind-exposed sites. The findings also have implications for the effects of global change in subarctic environments: the documented potential of mountain birch for rapid evolutionary change, together with its general lack of evolutionary 'dead ends' due to not (over)specialising to current natural conditions, increases the chances of this crucial forest-forming tree persisting even under the anticipated climate change.

Relevance: 100.00%

Abstract:

Understanding the factors controlling fine root respiration (FRR) at different temporal scales will help to improve our knowledge of the spatial and temporal variability of soil respiration (SR) and to improve future predictions of CO2 effluxes to the atmosphere. Here we present a comparative study of how FRR responds to variability in soil temperature and moisture in two widespread species, Scots pine (Pinus sylvestris L.) and Holm-oak (HO; Quercus ilex L.). These two species show contrasting water-use strategies during the extreme summer-drought conditions that characterize the Mediterranean climate. The study was carried out in a mixed Mediterranean forest where Scots pines affected by drought-induced die-back are slowly being replaced by the more drought-resistant HO. FRR was measured in spring and early fall 2013 in excised roots freshly removed from the soil, collected under HO and under Scots pines at three different health stages: dead (D), defoliated (DP) and non-defoliated (NDP). Variations in soil temperature, soil water content and daily mean assimilation per tree were also recorded to evaluate the sensitivity of FRR to abiotic and biotic environmental variation. Our results show that FRR was substantially lower under HO (1.26 ± 0.16 μg CO2 / g root · min) than under living pines (1.89 ± 0.19 μg CO2 / g root · min), which disagrees with the similar rates of soil respiration previously observed under both canopies and suggests that the contribution of FRR to total SR varies under different tree species. The similarity of FRR rates under HO and DP furthermore supports previous studies suggesting a recent Holm-oak root colonization of the gaps under dead trees. A linear mixed-effects model approach indicated that seasonal variation in FRR was best explained by soil temperature (p<0.05), while soil moisture did not exert any direct control over FRR, despite the low soil moisture values during the summer sampling. Plant assimilation rates were positively related to FRR, explaining part of the observed variability (p<0.01); however, this positive relation occurred mainly during spring, when both soil moisture and plant assimilation rates were higher. Our results finally suggest that plants may be able to maintain relatively high rates of FRR during the sub-optimal abiotic and biotic summer conditions, probably thanks to their capacity to re-mobilize carbon reserves and to passively move water from moister layers to upper layers with lower water potentials (where the fine roots were collected) by hydraulic lift.
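A minimal sketch of a linear mixed-effects analysis like the one described, using statsmodels' mixedlm; the random-intercept grouping by tree and all data values are assumptions, not the study's actual model or measurements:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 120  # invented observations

df = pd.DataFrame({
    "frr": rng.normal(1.5, 0.4, n),          # ug CO2 / g root / min
    "soil_temp": rng.uniform(8, 25, n),      # deg C
    "soil_moisture": rng.uniform(0.05, 0.3, n),
    "assimilation": rng.normal(5, 2, n),     # daily mean per tree
    "tree": rng.integers(0, 12, n),          # random-effect grouping
})

# Fixed effects for temperature, moisture and assimilation; random
# intercept per tree. The exact structure used in the study is unknown
# here, so this is only an illustrative specification.
model = smf.mixedlm("frr ~ soil_temp + soil_moisture + assimilation",
                    data=df, groups=df["tree"]).fit()
print(model.summary())
```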

Relevance: 100.00%

Abstract:

Virtually every cell and organ in the human body depends on a proper oxygen supply. This is taken care of by the cardiovascular system, which supplies tissues with oxygen precisely according to their metabolic needs. Physical exercise is one of the most demanding challenges the human circulatory system can face. During exercise, skeletal muscle blood flow can easily increase some 20-fold, and its proper distribution to and within muscles is important for optimal oxygen delivery. The local regulation of skeletal muscle blood flow during exercise remains little understood, but adenosine and nitric oxide may take part in this process. In addition to acute exercise, long-term vigorous physical conditioning also induces changes in the cardiovasculature, which lead to improved maximal physical performance. The changes are largely central, such as structural and functional changes in the heart. The function and reserve of the heart's own vasculature can be studied by adenosine infusion, which according to animal studies evokes vasodilation via its A2A receptors. This has, however, never been addressed in humans in vivo, and studies in endurance athletes have shown inconsistent results regarding the effects of sports training on myocardial blood flow. This study was performed on healthy young adults and endurance athletes, and local skeletal and cardiac muscle blood flow was measured by positron emission tomography. In the heart, myocardial blood flow reserve and adenosine A2A receptor density were measured; in skeletal muscle, oxygen extraction and consumption were also measured. The role of adenosine in the control of skeletal muscle blood flow during exercise, and its vasodilator effects, were addressed by infusing competitive inhibitors and adenosine into the femoral artery. The formation of skeletal muscle nitric oxide was also inhibited pharmacologically, with and without prostanoid blockade. As a result and conclusion, it can be said that skeletal muscle blood flow heterogeneity decreases with increasing exercise intensity, most likely due to increased vascular unit recruitment; exercise hyperemia, however, is a very complex phenomenon that cannot be mimicked by pharmacological infusions, and no single regulatory factor (e.g. adenosine or nitric oxide) accounts for a significant part of exercise-induced muscle hyperemia. Nevertheless, in the present study it was observed for the first time in humans that nitric oxide is an important regulator not only of the basal level of muscle blood flow but also of oxygen consumption, and that together with prostanoids it affects muscle blood flow and oxygen consumption during exercise. Finally, even vigorous endurance training does not seem to lead to a supranormal myocardial blood flow reserve, and receptors other than A2A also mediate the vasodilator effects of adenosine. With respect to cardiac work, the athlete's heart seems to be luxuriously perfused at rest, which may result from reduced oxygen extraction or from impaired efficiency due to the pronouncedly enhanced myocardial mass developed to excel in strenuous exercise.