961 results for "0.9 per mil were added"


Relevance: 100.00%

Abstract:

BACKGROUND: Multislice CT (MSCT) combined with D-dimer measurement can safely exclude pulmonary embolism in patients with a low or intermediate clinical probability of this disease. We compared this combination with a strategy in which both a negative venous ultrasonography of the leg and MSCT were needed to exclude pulmonary embolism. METHODS: We included 1819 consecutive outpatients with clinically suspected pulmonary embolism in a multicentre non-inferiority randomised controlled trial comparing two strategies: clinical probability assessment and either D-dimer measurement and MSCT (DD-CT strategy [n=903]) or D-dimer measurement, venous compression ultrasonography of the leg, and MSCT (DD-US-CT strategy [n=916]). Randomisation was by computer-generated blocks with stratification according to centre. Patients with a high clinical probability according to the revised Geneva score and a negative work-up for pulmonary embolism were further investigated in both groups. The primary outcome was the 3-month thromboembolic risk in patients who were left untreated on the basis of the exclusion of pulmonary embolism by diagnostic strategy. Clinicians assessing outcome were blinded to group assignment. Analysis was per protocol. This study is registered with ClinicalTrials.gov, number NCT00117169. FINDINGS: The prevalence of pulmonary embolism was 20.6% in both groups (189 cases in DD-US-CT group and 186 in DD-CT group). We analysed 855 patients in the DD-US-CT group and 838 in the DD-CT group per protocol. The 3-month thromboembolic risk was 0.3% (95% CI 0.1-1.1) in the DD-US-CT group and 0.3% (0.1-1.2) in the DD-CT group (difference 0.0% [-0.9 to 0.8]). In the DD-US-CT group, ultrasonography showed a deep-venous thrombosis in 53 (9% [7-12]) of 574 patients, and thus MSCT was not undertaken. 
INTERPRETATION: The strategy combining D-dimer and MSCT is as safe as the strategy using D-dimer followed by venous compression ultrasonography of the leg and MSCT for exclusion of pulmonary embolism. An ultrasound could be of use in patients with a contraindication to CT.
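The safety outcome above is an observed 3-month event risk per group, with the two strategies compared by the difference in those risks. A minimal sketch of that arithmetic, assuming hypothetical event counts (the trial's exact counts and CI method are not given here, so a simple Wald interval is used purely for illustration):

```python
import math

def risk_pct(events, n):
    """Observed event risk as a percentage."""
    return events / n * 100

def risk_difference_ci(e1, n1, e2, n2, z=1.96):
    """Risk difference in percentage points with a Wald 95% CI.
    Illustrative only: the trial likely used a more exact interval."""
    p1, p2 = e1 / n1, e2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return diff * 100, (diff - z * se) * 100, (diff + z * se) * 100

# Hypothetical counts on the order of the reported 0.3% risks
print(risk_pct(3, 1000))                      # ≈ 0.3
print(risk_difference_ci(3, 855, 3, 838))
```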

Relevance: 100.00%

Abstract:

Cyclosporine is a substrate of cytochrome P450 (CYP) 3A and of the transporter ABCB1, for which polymorphisms have been described. In particular, the CYP3A5 *3/*3 genotype results in the absence of CYP3A5 activity, whereas the CYP3A7 *1/*1C genotype results in high CYP3A7 expression in adults. Log-transformed dose-adjusted cyclosporine trough concentrations and daily doses per weight were compared 1, 3, 6, and 12 months after transplantation between CYP3A and ABCB1 genotypes in 73 renal (n = 64) or lung (n = 9) transplant recipients. CYP3A5 expressors (*1/*3 genotype; n = 8-10) presented significantly lower dose-adjusted cyclosporine trough concentrations (P < 0.05) and required significantly higher daily doses per weight (P < 0.01) than the nonexpressors (*3/*3 genotype; n = 55-59) 1, 3, 6, and 12 months after transplantation. In addition, 7 days after transplantation, more CYP3A5 expressors had an uncorrected trough cyclosporine concentration below the target concentration of 200 ng/mL than the nonexpressors (odds ratio = 7.2; 95% confidence interval = 1.4-37.3; P = 0.009). CYP3A4 rs4646437C>T influenced cyclosporine kinetics, with T carriers requiring a higher cyclosporine dose. CYP3A7*1C carriers required a 1.4-fold to 1.6-fold higher cyclosporine daily dose during the first year after transplantation (P < 0.05). In conclusion, CYP3A4, CYP3A5, and CYP3A7 polymorphisms affect cyclosporine metabolism, and therefore their genotyping could be useful, in association with therapeutic drug monitoring, to prospectively optimize cyclosporine prescription in transplant recipients. The administration of a CYP3A genotype-dependent cyclosporine starting dose should therefore be tested prospectively in a randomized controlled clinical trial to assess whether it improves patient outcomes after transplantation, with adequate immunosuppression and decreased toxicity.
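The comparison above rests on a derived variable, the dose-adjusted trough concentration: the measured trough divided by the weight-adjusted daily dose, log-transformed before analysis. A hedged sketch of that derivation, with invented patient values (not data from the study):

```python
import math

def dose_adjusted_trough(trough_ng_ml, daily_dose_mg, weight_kg):
    """Trough concentration divided by daily dose per kg body weight,
    i.e. (ng/mL) / (mg/kg/day), the quantity the study log-transforms."""
    dose_per_weight = daily_dose_mg / weight_kg
    return trough_ng_ml / dose_per_weight

# Hypothetical patient: trough 150 ng/mL on 300 mg/day at 75 kg
ratio = dose_adjusted_trough(150, 300, 75)   # 37.5 (ng/mL)/(mg/kg/day)
log_ratio = math.log10(ratio)                # ~1.57 after log10 transform
```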

Relevance: 100.00%

Abstract:

CONTEXT: In populations of older adults, prediction of coronary heart disease (CHD) events through traditional risk factors is less accurate than in middle-aged adults. Electrocardiographic (ECG) abnormalities are common in older adults and might be of value for CHD prediction. OBJECTIVE: To determine whether baseline ECG abnormalities or development of new and persistent ECG abnormalities are associated with increased CHD events. DESIGN, SETTING, AND PARTICIPANTS: A population-based study of 2192 white and black older adults aged 70 to 79 years from the Health, Aging, and Body Composition Study (Health ABC Study) without known cardiovascular disease. Adjudicated CHD events were collected over 8 years between 1997-1998 and 2006-2007. Baseline and 4-year ECG abnormalities were classified according to the Minnesota Code as major and minor. Using Cox proportional hazards regression models, the addition of ECG abnormalities to traditional risk factors was examined to predict CHD events. MAIN OUTCOME MEASURE: Adjudicated CHD events (acute myocardial infarction [MI], CHD death, and hospitalization for angina or coronary revascularization). RESULTS: At baseline, 276 participants (13%) had minor and 506 (23%) had major ECG abnormalities. During follow-up, 351 participants had CHD events (96 CHD deaths, 101 acute MIs, and 154 hospitalizations for angina or coronary revascularizations). Both baseline minor and major ECG abnormalities were associated with an increased risk of CHD after adjustment for traditional risk factors (17.2 per 1000 person-years among those with no abnormalities; 29.3 per 1000 person-years; hazard ratio [HR], 1.35; 95% CI, 1.02-1.81; for minor abnormalities; and 31.6 per 1000 person-years; HR, 1.51; 95% CI, 1.20-1.90; for major abnormalities).
When ECG abnormalities were added to a model containing traditional risk factors alone, 13.6% of intermediate-risk participants with both major and minor ECG abnormalities were correctly reclassified (overall net reclassification improvement [NRI], 7.4%; 95% CI, 3.1%-19.0%; integrated discrimination improvement, 0.99%; 95% CI, 0.32%-2.15%). After 4 years, 208 participants had new and 416 had persistent abnormalities. Both new and persistent ECG abnormalities were associated with an increased risk of subsequent CHD events (HR, 2.01; 95% CI, 1.33-3.02; and HR, 1.66; 95% CI, 1.18-2.34; respectively). When added to the Framingham Risk Score, the NRI was not significant (5.7%; 95% CI, -0.4% to 11.8%). CONCLUSIONS: Major and minor ECG abnormalities among older adults were associated with an increased risk of CHD events. Depending on the model, adding ECG abnormalities was associated with improved risk prediction beyond traditional risk factors.
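Net reclassification improvement (NRI), cited above, is the net proportion of events moved to a higher risk category plus the net proportion of non-events moved to a lower one. A small sketch of the standard formula, with illustrative counts (not the Health ABC data):

```python
def nri(up_ev, down_ev, n_ev, up_non, down_non, n_non):
    """Net reclassification improvement:
    net upward movement among events plus net downward movement among non-events."""
    event_term = (up_ev - down_ev) / n_ev
    nonevent_term = (down_non - up_non) / n_non
    return event_term + nonevent_term

# Illustrative: among 100 events, 20 move up and 10 down;
# among 1000 non-events, 50 move down and 30 up
print(nri(20, 10, 100, 30, 50, 1000))  # about 0.12, i.e. an NRI of 12%
```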

Relevance: 100.00%

Abstract:

BACKGROUND: The risk of catheter-related infection or bacteremia, with initial and extended use of femoral versus nonfemoral sites for double-lumen vascular catheters (DLVCs) during continuous renal replacement therapy (CRRT), is unclear. STUDY DESIGN: Retrospective observational cohort study. SETTING & PARTICIPANTS: Critically ill patients on CRRT in a combined intensive care unit of a tertiary institution. FACTOR: Femoral versus nonfemoral venous DLVC placement. OUTCOMES: Catheter-related colonization (CRCOL) and bloodstream infection (CRBSI). MEASUREMENTS: CRCOL/CRBSI rates expressed per 1,000 catheter-days. RESULTS: We studied 458 patients (median age, 65 years; 60% males) and 647 DLVCs. Of 405 patients who used a single site only, 82% received exclusively femoral DLVCs (419 catheters) and 18% exclusively jugular or subclavian DLVCs (82 catheters). The corresponding DLVC indwelling duration was 6±4 versus 7±5 days (P=0.03). Corresponding CRCOL and CRBSI rates (per 1,000 catheter-days) were 9.7 versus 8.8 events (P=0.8) and 1.2 versus 3.5 events (P=0.3), respectively. Overall, 96 patients with extended CRRT received femoral-site insertion first with subsequent site change, including 53 femoral guidewire exchanges, 53 new femoral venipunctures, and 47 new jugular/subclavian sites. CRCOL and CRBSI rates were similar for all such approaches (P=0.7 and P=0.9, respectively). On multivariate analysis, CRCOL risk was higher in patients older than 65 years and weighing >90 kg (ORs of 2.1 and 2.2, respectively; P<0.05). This association between higher weight and greater CRCOL risk was significant for femoral DLVCs, but not for nonfemoral sites. Other covariates, including initial or specific DLVC site, guidewire exchange versus new venipuncture, and primary versus secondary DLVC placement, did not significantly affect CRCOL rates. LIMITATIONS: Nonrandomized retrospective design and single-center evaluation.
CONCLUSIONS: CRCOL and CRBSI rates in patients on CRRT are low and not influenced significantly by initial or serial femoral catheterizations with guidewire exchange or new venipuncture. CRCOL risk is higher in older and heavier patients, the latter especially so with femoral sites.
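The colonization and infection rates above are expressed per 1,000 catheter-days: events divided by total catheter-days of exposure, scaled by 1,000. A minimal sketch of that calculation, with hypothetical counts (the study's raw event counts are not given in the abstract):

```python
def rate_per_1000_catheter_days(events, catheter_days):
    """Incidence rate per 1,000 catheter-days of exposure."""
    return events / catheter_days * 1000

# Hypothetical: 12 colonization events over 4,000 catheter-days
print(rate_per_1000_catheter_days(12, 4000))  # ≈ 3.0 per 1,000 catheter-days
```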

Relevance: 100.00%

Abstract:

OBJECTIVE: To evaluate deaths from AIDS-defining malignancies (ADM) and non-AIDS-defining malignancies (nADM) in the D:A:D Study and to investigate the relationship between these deaths and immunodeficiency. DESIGN: Observational cohort study. METHODS: Patients (23 437) were followed prospectively for 104 921 person-years. We used Poisson regression models to identify factors independently associated with deaths from ADM and nADM. Analyses of factors associated with mortality due to nADM were repeated after excluding nADM known to be associated with a specific risk factor. RESULTS: Three hundred five patients died due to a malignancy, 298 prior to the cutoff for this analysis (ADM: n = 110; nADM: n = 188). The mortality rate due to ADM decreased from 20.1/1000 person-years of follow-up [95% confidence interval (CI) 14.4, 25.9] when the most recent CD4 cell count was <50 cells/microl to 0.1 (0.03, 0.3)/1000 person-years of follow-up when the CD4 cell count was more than 500 cells/microl; the mortality rate from nADM decreased from 6.0 (95% CI 3.3, 10.1) to 0.6 (0.4, 0.8) per 1000 person-years of follow-up between these two CD4 cell count strata. In multivariable regression analyses, a two-fold higher latest CD4 cell count was associated with a halving of the risk of ADM mortality. Other predictors of an increased risk of ADM mortality were homosexual risk group, older age, a previous (non-malignancy) AIDS diagnosis and earlier calendar years. Predictors of an increased risk of nADM mortality included lower CD4 cell count, older age, current/ex-smoking status, longer cumulative exposure to combination antiretroviral therapy, active hepatitis B infection and earlier calendar year. CONCLUSION: The severity of immunosuppression is predictive of death from both ADM and nADM in HIV-infected populations.
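The mortality rates in this analysis are deaths per 1,000 person-years of follow-up. Using the totals quoted in the abstract (110 ADM and 188 nADM deaths over 104,921 person-years), the crude overall rates can be reproduced; note these are cohort-wide figures, not the CD4-stratified rates reported above:

```python
def deaths_per_1000_py(deaths, person_years):
    """Crude mortality rate per 1,000 person-years of follow-up."""
    return deaths / person_years * 1000

# Totals reported in the abstract
adm_rate = deaths_per_1000_py(110, 104921)    # ~1.05 per 1,000 person-years
nadm_rate = deaths_per_1000_py(188, 104921)   # ~1.79 per 1,000 person-years
```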

Relevance: 100.00%

Abstract:

Fossil bones and teeth of Late Pleistocene terrestrial mammals from Rhine River gravels (RS) and the North Sea (NS), which have been exposed to chemically and isotopically distinct diagenetic fluids (fresh water versus seawater), were investigated to study the effects of early diagenesis on biogenic apatite. Changes in phosphate oxygen isotopic composition (δ18OPO4), nitrogen content (wt.% N) and rare earth element (REE) concentrations were measured along profiles within bones that have not been completely fossilized, and in skeletal tissues (bone, dentine, enamel) with different susceptibilities to diagenetic alteration. Early diagenetic changes of elemental and isotopic compositions of apatite in fossil bone are related to the loss of the stabilizing collagen matrix. The REE concentration is negatively correlated with the nitrogen content, and therefore the amount of collagen provides a sensitive proxy for early diagenetic alteration. REE patterns of RS and NS bones indicate initial fossilization in a fresh water fluid with similar REE compositions. Bones from both settings have nearly collagen-free, REE-, U-, F- and Sr-enriched altered outer rims, while the collagen-bearing bone compacta in the central part often display early diagenetic pyrite void-fillings. However, NS bones exposed to Holocene seawater have outer-rim δ18OPO4 values that are 1.1 to 2.6‰ higher than the central part of the same bones (δ18OPO4 = 18.2 ± 0.9‰, n = 19). Surprisingly, even the collagen-rich bone compacta with low REE contents and apatite crystallinity seems altered, as NS tooth enamel (δ18OPO4 = 15.0 ± 0.3‰, n = 4) has about 3‰ lower δ18OPO4 values, values that are also similar to those of enamel from RS teeth. Therefore, REE concentration, N content and apatite crystallinity are in this case only poor proxies for the alteration of δ18OPO4 values.
Seawater exposure of a few years up to 8 kyr can change the δ18OPO4 values of bone apatite by >3‰. Therefore, bones fossilized in marine settings must be treated with caution for palaeoclimatic reconstructions. However, enamel seems to preserve pristine δ18OPO4 values on this time scale. Using species-specific calibrations for modern mammals, a mean δ18OH2O value of around −9.2 ± 0.5‰ can be reconstructed for Late Pleistocene mammalian drinking water, which is similar to that of Late Pleistocene groundwater from central Europe.
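The per mil values quoted above follow standard delta notation, in which an isotope ratio is reported relative to a reference material (for oxygen, commonly VSMOW). For readers unfamiliar with the convention:

```latex
\delta^{18}\mathrm{O} =
  \left(
    \frac{\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{sample}}}
         {\left(^{18}\mathrm{O}/^{16}\mathrm{O}\right)_{\mathrm{VSMOW}}} - 1
  \right) \times 1000\ \text{\textperthousand}
```

So a δ18OPO4 shift of 3‰ corresponds to a 0.3% relative change in the 18O/16O ratio of the phosphate oxygen.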

Relevance: 100.00%

Abstract:

Antiresorptive agents such as bisphosphonates induce a rapid increase of BMD during the 1st year of treatment and a partial maintenance of bone architecture. Trabecular Bone Score (TBS), a new grey-level texture measurement that can be extracted from the DXA image, correlates with 3D parameters of bone micro-architecture. Aim: To evaluate the longitudinal effect of antiresorptive agents on spine BMD and on site-matched spine micro-architecture as assessed by TBS. Methods: From the BMD database for the Province of Manitoba, Canada, we selected women aged >50 with paired baseline and follow-up spine DXA examinations who had not received any prior HRT or other antiresorptive drug. Women were divided into two subgroups: (1) those not receiving any HRT or antiresorptive drug during follow-up (non-users) and (2) those receiving a non-HRT antiresorptive drug during follow-up (users) with high adherence (medication possession ratio >75%) from a provincial pharmacy database system. Lumbar spine TBS was derived by the Bone Disease Unit, University of Lausanne, for each spine DXA examination using anonymized files (blinded from clinical parameters and outcomes). Effects of antiresorptive treatment for users and non-users on TBS and BMD at baseline and during a mean 3.7 years of follow-up were compared. Results were expressed as % change per year. Results: 1150 non-users and 534 users met the inclusion criteria. At baseline, users and non-users had a mean age and BMI of [62.2±7.9 vs 66.1±8.0 years] and [26.3±4.7 vs 24.7±4.0 kg/m²], respectively. Antiresorptive drugs received by users were bisphosphonates (86%), raloxifene (10%) and calcitonin (4%). Significant differences in BMD change and TBS change were seen between users and non-users during follow-up (p<0.0001). Significant decreases in mean BMD and TBS (−0.36±0.05% per year; −0.31±0.06% per year) were seen for non-users compared with baseline (p<0.001).
A significant increase in mean BMD was seen for users compared with baseline (+1.86±0.0% per year, p<0.0018). TBS of users also increased compared with baseline (+0.20±0.08% per year, p<0.001), but more slowly than BMD. Conclusion: We observed a significant increase in spine BMD and a positive maintenance of bone micro-architecture from TBS with antiresorptive treatment, whereas the treatment naïve group lost both density and micro-architecture. TBS seems to be responsive to treatment and could be suitable for monitoring micro-architecture. This article is part of a Special Issue entitled ECTS 2011. Disclosure of interest: M.-A. Krieg: None declared, A. Goertzen: None declared, W. Leslie: None declared, D. Hans Consulting fees from Medimaps.

Relevance: 100.00%

Abstract:

BACKGROUND: The activity of the renin-angiotensin system is usually evaluated as plasma renin activity (PRA, ngAI/ml per h), but the reproducibility of this enzymatic assay is notoriously poor. We compared the inter- and intralaboratory reproducibilities of PRA with those of a new automated chemiluminescent assay, which allows the direct quantification of immunoreactive renin [chemiluminescent immunoreactive renin (CLIR), microU/ml]. METHODS: Aliquots from six pool plasmas of patients with very low to very high PRA levels were measured in 12 centres with both the enzymatic and the direct assays. The same methods were applied to three control plasma preparations with known renin content. RESULTS: In pool plasmas, mean PRA values ranged from 0.14 +/- 0.08 to 18.9 +/- 4.1 ngAI/ml per h, whereas those of CLIR ranged from 4.2 +/- 1.7 to 436 +/- 47 microU/ml. In control plasmas, mean values of PRA and of CLIR were always within the expected range. Overall, there was a significant correlation between the two methods (r = 0.73, P < 0.01). Similar correlations were found in plasmas subdivided into those with low, intermediate and high PRA. However, the coefficients of variation among laboratories were always higher for PRA than for CLIR, ranging from 59.4 to 17.1% for PRA and from 41.0 to 10.7% for CLIR (P < 0.01). Also, the mean intralaboratory variability was higher for PRA than for CLIR, being 8.5 and 4.5%, respectively (P < 0.01). CONCLUSION: The measurement of renin with the chemiluminescent method is a reliable alternative to PRA, having the advantage of superior inter- and intralaboratory reproducibility.
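The interlaboratory reproducibility figures above are coefficients of variation: the standard deviation of the results across laboratories expressed as a percentage of their mean. A minimal sketch, with invented values (not the study's measurements):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return statistics.stdev(values) / statistics.mean(values) * 100

# Hypothetical renin results (microU/ml) for one pool plasma across laboratories
print(cv_percent([8.0, 10.0, 12.0]))  # 20.0 (%)
```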

Relevance: 100.00%

Abstract:

OBJECTIVES: To determine 1) HIV testing practices in a 1400-bed university hospital where local HIV prevalence is 0.4% and 2) the effect on testing practices of national HIV testing guidelines, revised in March 2010, recommending Physician-Initiated Counselling and Testing (PICT). METHODS: Using 2 hospital databases, we determined the number of HIV tests performed by selected clinical services, and the number of patients tested as a percentage of the number seen per service ('testing rate'). To explore the effect of the revised national guidelines, we examined testing rates for two years pre- and two years post-PICT guideline publication. RESULTS: Combining the clinical services, 253,178 patients were seen and 9,183 tests were performed (of which 80 tested positive, 0.9%) in the four-year study period. The emergency department (ED) performed the second highest number of tests, but had the lowest testing rates (0.9-1.1%). Of inpatient services, neurology and psychiatry had higher testing rates than internal medicine (19.7% and 9.6% versus 8%, respectively). There was no significant increase in testing rates, either globally or in the majority of the clinical services examined, and no increase in new HIV diagnoses post-PICT recommendations. CONCLUSIONS: Using a simple two-database tool, we observe no global improvement in HIV testing rates in our hospital following new national guidelines but do identify services where testing practices merit improvement. This study may show the limit of PICT strategies based on physician risk assessment, compared to the opt-out approach.
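The testing rate and positivity figures above are simple proportions, and the abstract's own totals (9,183 tests among 253,178 patients seen, 80 positive tests) reproduce them:

```python
def pct(numerator, denominator):
    """Proportion expressed as a percentage."""
    return numerator / denominator * 100

# Totals reported in the abstract
testing_rate = pct(9183, 253178)   # ~3.6% of patients seen were tested
positivity = pct(80, 9183)         # ~0.9% of tests were positive
```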

Relevance: 100.00%

Abstract:

The objective of this work was to develop a genetic transformation system for tropical maize genotypes via particle bombardment of immature zygotic embryos. Particle bombardment was carried out using a genetic construct with the bar and uidA genes under control of the CaMV35S promoter. The best conditions to transform the tropical maize inbred lines L3 and L1345 were obtained when immature embryos were cultivated, prior to bombardment, at higher osmolarity for 4 hours and bombarded at a helium acceleration pressure of 1,100 psi, with two shots per plate and a microcarrier flying distance of 6.6 cm. Transformation frequencies obtained under these conditions ranged from 0.9 to 2.31%. Integration of the foreign genes into the genome of maize plants was confirmed by Southern blot analysis as well as by bar and uidA gene expression. The maize genetic transformation protocol developed in this work may improve the efficiency of producing new transgenic tropical maize lines expressing desirable agronomic characteristics.
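Transformation frequency in bombardment experiments of this kind is typically the number of independent transgenic events per bombarded embryo, expressed as a percentage. A hedged sketch with invented counts, chosen only so the result lands inside the 0.9-2.31% range reported (the study's actual event and embryo counts are not given in the abstract):

```python
def transformation_frequency(transgenic_events, embryos_bombarded):
    """Independent transgenic events per bombarded immature embryo, in percent."""
    return transgenic_events / embryos_bombarded * 100

# Hypothetical experiment: 3 independent events from 130 bombarded embryos
print(round(transformation_frequency(3, 130), 2))  # 2.31
```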

Relevance: 100.00%

Abstract:

OBJECTIVES: To investigate the effect of a change in second-hand smoke (SHS) exposure on heart rate variability (HRV) and pulse wave velocity (PWV), this study utilized a quasi-experimental setting when a smoking ban was introduced. METHODS: HRV, a quantitative marker of autonomic activity of the nervous system, and PWV, a marker of arterial stiffness, were measured in 55 non-smoking hospitality workers before and 3-12 months after a smoking ban and compared to a control group that did not experience an exposure change. SHS exposure was determined with a nicotine-specific badge and expressed as inhaled cigarette equivalents per day (CE/d). RESULTS: PWV and HRV parameters significantly changed in a dose-dependent manner in the intervention group as compared to the control group. A one CE/d decrease was associated with a 2.3 % (95 % CI 0.2-4.4; p = 0.031) higher root mean square of successive differences (RMSSD), a 5.7 % (95 % CI 0.9-10.2; p = 0.02) higher high-frequency component and a 0.72 % (95 % CI 0.40-1.05; p < 0.001) lower PWV. CONCLUSIONS: PWV and HRV significantly improved after introducing smoke-free workplaces indicating a decreased cardiovascular risk.
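RMSSD, the HRV parameter reported above, is the root mean square of successive differences between adjacent normal RR intervals. A minimal sketch with an invented RR series (not study data):

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences of RR intervals (ms)."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical RR series (ms): differences are +10 and -20
print(round(rmssd([800, 810, 790]), 2))  # 15.81
```

Higher RMSSD reflects greater short-term (vagally mediated) heart rate variability, which is why the increase after the smoking ban is read as a cardiovascular improvement.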

Relevance: 100.00%

Abstract:

CONTEXT: A shortening of the atrial refractory period has been considered the main mechanism for the increased risk of atrial fibrillation in hyperthyroidism. However, other important factors may be involved. OBJECTIVE: Our objective was to determine the activity of abnormal supraventricular electrical depolarizations in response to elevated thyroid hormones in patients without structural heart disease. PATIENTS AND DESIGN: Twenty-eight patients (25 females, three males, mean age 43 +/- 11 yr) with newly diagnosed and untreated hyperthyroidism were enrolled in a prospective trial after exclusion of heart disease. Patients were followed up for 16 +/- 6 months and studied at baseline and 6 months after normalization of serum TSH levels. MAIN OUTCOME MEASURES: The incidence of abnormal premature supraventricular depolarizations (SVPD) and the number of episodes of supraventricular tachycardia were defined as primary outcome measurements before the start of the study. In addition, heart rate oscillations (turbulence) after premature depolarizations and heart rate variability were compared at baseline and follow-up. RESULTS: SVPDs decreased from 59 +/- 29 to 21 +/- 8 per 24 h (P = 0.003), very early SVPDs (so-called P on T) decreased from 36 +/- 24 to 3 +/- 1 per 24 h (P < 0.0001), and nonsustained supraventricular tachycardias decreased from 22 +/- 11 to 0.5 +/- 0.2 per 24 h (P = 0.01) after normalization of serum thyrotropin levels. The hyperthyroid phase was characterized by an increased heart rate (93 +/- 14 vs. 79 +/- 8 beats/min, P < 0.0001) and a decreased turbulence slope (3.6 vs. 9.2, P = 0.003), consistent with decreased vagal tone. This was confirmed by a significant decrease of heart rate variability. CONCLUSION: Hyperthyroidism is associated with an increased supraventricular ectopic activity in patients with normal hearts.
The activation of these arrhythmogenic foci by elevated thyroid hormones may be an important causal link between hyperthyroidism and atrial fibrillation.

Relevance: 100.00%

Abstract:

Objectives: To study the dental status and treatment needs of institutionalized older adults with chronic mental illness compared to a non-psychiatric control sample. Study Design: The sample size was 100, in which 50 were psychogeriatric patients (study group; SG) classified according to DSM-IV, with a mean age of 69.6 ± 6.7 years, and 50 non-psychiatric patients (control group; CG), with a mean age of 68.3 ± 6.9 years. Clinical oral health examinations were conducted and caries were recorded clinically using the Decayed, Missing and Filled Teeth Index (DMFT). Results were analyzed statistically using Student's t-test or analysis of variance. Results: Caries prevalence was 58% and 62% in SG and CG, respectively. DMFT index was 28.3 ± 6.6 in SG and 21.4 ± 6.07 in CG (p < 0.01). Mean number of decayed teeth was higher in SG (3.1) compared to CG (1.8) (p = 0.047). Mean number of missing teeth was 25.2 and 16.4 in SG and CG, respectively (p < 0.05). DMFT scores were higher in SG in all the age groups (p < 0.01). Mean number of teeth per person needing treatment was 3.4 in SG and 1.9 in CG (p = 0.037). The need for restorative dental care was significantly lower in the SG (0.8 teeth per person) than in the CG (1.7 teeth per person) (p = 0.043). Conclusions: Institutionalized psychiatric patients have significantly worse dental status and more dental treatment needs than non-psychiatric patients.
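The DMFT index used above is simply the per-person sum of decayed, missing, and filled teeth. The study-group means reproduce the reported index; note the filled component is inferred here as approximately zero from 28.3 − 3.1 − 25.2, an assumption rather than a figure the abstract states:

```python
def dmft(decayed, missing, filled):
    """Decayed, Missing and Filled Teeth index: per-person sum of the three counts."""
    return decayed + missing + filled

# Study-group means from the abstract: 3.1 decayed + 25.2 missing (+ ~0 filled, inferred)
print(round(dmft(3.1, 25.2, 0.0), 1))  # 28.3
```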

Relevance: 100.00%

Abstract:

OBJECTIVE: The objective of this study was to investigate the effects of chronic and intermittent hypoxia on myocardial morphology. METHODS: Rats randomly divided into 3 groups (n = 14 per group) were exposed to room air (Fio(2) = 0.21), chronic hypoxia (Fio(2) = 0.10), or intermittent hypoxia (chronic hypoxia with 1 hour per day of room air) for 2 weeks. Weight, blood gas analysis, hematocrit, hemoglobin, red cells, and right and left ventricular pressures were measured. Hearts excised for morphologic examination were randomly divided into 2 groups (9 per group for gross morphologic measurements and 5 per group for histologic and morphometric analysis). The weight ratio of right to left ventricles plus interventricular septum, myocyte diameter, cross-sectional area, and free wall thickness in right and left ventricles were measured. RESULTS: Despite the same polycythemia, the right ventricle pressure (P <.05) and ratio of right to left ventricle pressures (P <.02) were higher after chronic hypoxia than intermittent hypoxia. The ratio of heart weight to total body weight and the ratio of right to left ventricles plus interventricular septum were higher (P <.01) in chronic and intermittent hypoxia than in normoxia. Myocyte diameter was not different between the right and left ventricles in normoxia, whereas right ventricle myocytes were larger than left ventricle myocytes in chronic hypoxia (P <.05) and intermittent hypoxia (P <.0005). There was marked dilatation of right ventricle size (P <.001) and marked reduction of left ventricle size (P <.001) in chronic and intermittent hypoxia compared with normoxia. The total ventricular area (right ventricle plus left ventricle area) remained the same in all groups. The wall thickness ratio in chronic hypoxia and intermittent hypoxia was increased (P <.001) compared with normoxia in the right ventricle but not in the left ventricle.
CONCLUSIONS: Intermittent reoxygenation episodes do not induce a lesser ventricular hypertrophic response than observed with chronic hypoxia. The functional myocardial preconditioning consequence of intermittent reoxygenation is not supported by structural differences evident with the available techniques.

Relevance: 100.00%

Abstract:

The deposition of Late Pleistocene and Holocene sediments in the high-altitude lake Meidsee (located at 2661 m a.s.l. in the Southwestern Alps) strikingly coincided with global ice-sheet and mountain-glacier decay in the Alpine forelands and the formation of perialpine lakes. Radiocarbon ages of bottom-core sediments indicate (pre-)Holocene ice retreat below 2700 m a.s.l. at about 16, 13, 10, and 9 cal. kyr BP. The Meidsee sedimentary record therefore provides information about high-altitude Alpine landscape evolution since the Late Pleistocene/Holocene deglaciation in the Swiss Southwestern Alps. Prior to 5 cal. kyr BP, the C/N ratio and the isotopic composition of sedimentary organic matter (δ15Norg, δ13Corg) indicate the deposition of algal-derived organic matter with limited input of terrestrial organic matter. The early Holocene and the Holocene climatic optimum (between 7.0 and 5.5 cal. kyr BP) were characterized by low erosion (decreasing magnetic susceptibility, χ) and a high content of organic matter (Corg > 13 wt.%), enriched in 13C (δ13Corg > −18‰) with a low C/N ratio (~10), typical of modern algal matter derived from in situ production. During the late Holocene, there was a long-term increasing contribution of terrestrial organic matter into the lake (C/N > 11), with maxima between 2.4 and 0.9 cal. kyr BP. A major environmental change took place 800 years ago, with an abrupt decrease in the relative contribution of terrestrial organic material into the lake compared with aquatic organic material, which subsequently largely dominated (C/N drop from 16 to 10). Nonetheless, this event was marked by a rise in soil erosion (χ), in nutrient input (N and P contents) and in anthropogenic lead deposition, suggesting a human disturbance of Alpine ecosystems 800 years ago.
Indeed, this time period coincided with the migration of the Walser Alemannic people in the region, who settled at relatively high altitude in the Southwestern Alps for farming and maintaining Alpine passes.