Abstract:
After major volcanic eruptions the enhanced aerosol causes ozone changes through greater heterogeneous chemistry on particle surfaces (HET-AER) and through dynamical effects related to the radiative heating of the lower stratosphere (RAD-DYN). We carry out a series of experiments with an atmosphere–ocean–chemistry–climate model to assess how these two processes change stratospheric ozone and Northern Hemispheric (NH) polar vortex dynamics. Ensemble simulations are performed under present-day and preindustrial conditions, and with aerosol forcings representative of different eruption strengths, to investigate changes in the response behaviour. We show that the halogen component of the HET-AER effect dominates under present-day conditions with a global reduction of ozone (−21 DU for the strongest eruption), particularly at high latitudes, whereas the HET-AER effect increases stratospheric ozone due to N2O5 hydrolysis in a preindustrial atmosphere (maximum anomalies +4 DU). The halogen-induced ozone changes in the present-day atmosphere offset part of the strengthening of the NH polar vortex during mid-winter (reduction of up to −16 m s⁻¹ in January) and slightly amplify the dynamical changes in the polar stratosphere in late winter (+11 m s⁻¹ in March). The RAD-DYN mechanism leads to positive column ozone anomalies, which are reduced in a present-day atmosphere by amplified polar ozone depletion (maximum anomalies +12 and +18 DU for present-day and preindustrial conditions, respectively). For preindustrial conditions, the ozone response is consequently dominated by RAD-DYN processes, while under present-day conditions, HET-AER effects dominate. The dynamical response of the stratosphere is dominated by the RAD-DYN mechanism, showing an intensification of the NH polar vortex in winter (up to +10 m s⁻¹ in January). Ozone changes due to the RAD-DYN mechanism slightly reduce the response of the polar vortex after the eruption under present-day conditions.
Abstract:
As long as global CO₂ emissions continue to increase annually, long-term committed Earth system changes grow much faster than current observations. A novel metric linking this future growth to policy decisions today is the mitigation delay sensitivity (MDS), but MDS estimates for Earth system variables other than peak temperature (ΔT max) are missing. Using an Earth System Model of Intermediate Complexity, we show that the current emission increase rate causes a ΔT max increase roughly 3–7.5 times as fast as observed warming, and a millennial steric sea level rise (SSLR) 7–25 times as fast as observed SSLR, depending on the achievable rate of emission reductions after the peak of emissions. These ranges are only slightly affected by the uncertainty range in equilibrium climate sensitivity, which is included in the above values. The extent of ocean acidification at the end of the century is also strongly dependent on the starting time and rate of emission reductions. The preservable surface ocean area with sufficient aragonite supersaturation for coral reef growth is diminished globally at an MDS of roughly 25%–80% per decade. A near-complete loss of this area becomes unavoidable if mitigation is delayed for a few years to decades. Also with respect to aragonite, 12%–18% of the Southern Ocean surface becomes undersaturated per decade if emission reductions are delayed beyond 2015–2040. We conclude that the consequences of delaying global emission reductions are much better captured if the MDS of relevant Earth system variables is communicated in addition to current trends and total projected future changes.
Abstract:
Major volcanic eruptions generate widespread ocean cooling, which reduces upper ocean stratification. This effect has the potential to increase nutrient delivery into the euphotic zone and boost biological productivity. Using externally forced last millennium simulations of three climate/Earth System models (Model for Interdisciplinary Research On Climate (MIROC), Community Earth System Model (CESM), and LOch-Vecode-Ecbilt-CLio-agIsm Model (LOVECLIM)), we test the hypothesis that large volcanic eruptions intensify nutrient-driven export production. It is found that strong volcanic radiative forcing enhances the likelihood of eastern Pacific El Niño-like warming in CESM and LOVECLIM. This leads to an initial reduction of nutrients and export production in the eastern equatorial Pacific. However, this initial response reverses after about 3 years in association with La Niña cooling. The resulting delayed enhancement of biological production resembles the multiyear response in MIROC. The model simulations show that volcanic impacts on tropical Pacific dynamics and biogeochemistry persist for several years, thus providing a new source for potential multiyear ecosystem predictability.
Abstract:
Eringer cows are often slaughtered due to fertility problems resulting from inflammatory and degenerative changes of the uterus or from hormonal imbalances. Twenty-one genital tracts from Eringer cows suffering from fertility problems were collected at the abattoir. The purpose of the study was the macroscopic evaluation of the ovaries and the uterus, followed by histological and microbiological analysis of the uterus. Data from inseminations and calvings were provided by the Eringer breeding association and through the internet portal www.agate.ch. The median age of the cows was 6.9 years, the number of calves per cow was 2.5, and the median period between last calving and slaughter was 1.5 years. In 13 of the 21 genital tracts examined, macroscopic abnormalities of the ovaries and/or histological or microbiological findings in the uterus could explain the fertility-associated slaughter.
Abstract:
BACKGROUND Lung clearance index (LCI), a marker of ventilation inhomogeneity, is elevated early in children with cystic fibrosis (CF). However, in infants with CF, LCI values are found to be normal, although structural lung abnormalities are often detectable. We hypothesized that this discrepancy is due to inadequate algorithms in the available software package. AIM Our aim was to challenge the validity of these software algorithms. METHODS We compared multiple breath washout (MBW) results of current software algorithms (automatic modus) to refined algorithms (manual modus) in 17 asymptomatic infants with CF and 24 matched healthy term-born infants. The main difference between these two analysis methods lies in the calculation of the molar mass differences that the system uses to define the completion of the measurement. RESULTS In infants with CF the refined manual modus revealed clearly elevated LCI above 9 in 8 out of 35 measurements (23%), all showing LCI values below 8.3 using the automatic modus (paired t-test comparing the means, P < 0.001). Healthy infants showed normal LCI values using both analysis methods (n = 47, paired t-test, P = 0.79). The most relevant reason for falsely normal LCI values in infants with CF using the automatic modus was premature recognition of the end-of-test during the washout. CONCLUSION We recommend the use of the manual modus for the analysis of MBW outcomes in infants in order to obtain more accurate results. This will allow appropriate use of infant lung function results for clinical and scientific purposes.
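The abstract turns on when a multiple-breath washout is judged complete, since that end point determines the LCI. A minimal sketch (not the study's software), assuming the conventional end-of-test criterion that the end-tidal tracer concentration falls below 1/40 (2.5%) of its starting value, with LCI computed as cumulative expired volume divided by functional residual capacity (FRC):

```python
def lung_clearance_index(et_conc, breath_volumes_ml, frc_ml):
    """Compute LCI from a multiple-breath washout.

    et_conc: end-tidal tracer concentration per breath (fractions)
    breath_volumes_ml: expired volume per breath (ml)
    frc_ml: functional residual capacity (ml)

    The washout is taken as complete at the first breath whose end-tidal
    concentration drops below 1/40 of the starting value; LCI is the
    cumulative expired volume up to that breath divided by FRC.
    """
    threshold = et_conc[0] / 40.0
    cumulative_ml = 0.0
    for conc, vol in zip(et_conc, breath_volumes_ml):
        cumulative_ml += vol
        if conc < threshold:
            return cumulative_ml / frc_ml
    raise ValueError("washout never reached 1/40 of starting concentration")

# Synthetic washout: tracer starts at 4% and decays by 20% per breath.
et_conc = [0.04 * 0.8 ** i for i in range(25)]
volumes = [100.0] * 25            # ml expired per breath
lci = lung_clearance_index(et_conc, volumes, frc_ml=200.0)
```

Recognizing the end-of-test too early, as the automatic modus reportedly did, truncates the cumulative expired volume and yields a falsely low LCI.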
Abstract:
Schwarzsee is located in the western Swiss Alps, in a region that has been affected by numerous landslides during the Holocene, as evidenced by geological surveys. Lacustrine sediments were cored to a depth of 13 m. The vegetation history of the lake's catchment was reconstructed and investigated to identify possible impacts on slope stability. The pollen analyses record development of forest cover during the middle and late Holocene, and provide strong evidence for regional anthropogenic influence such as forest clearing and agricultural activity. Vegetation change is characterized by continuous landscape denudation that begins at ca. 4300 cal. yrs BP, with five distinct pulses of increased deforestation, at 3650, 2700, 1500, 900, and 450 cal. yrs BP. Each pulse can be attributed to increased human impact, recorded by the appearance or increase of specific anthropogenic indicator plant taxa. These periods of intensified deforestation also appear to be correlated with increased landslide activity in the lake's catchment and increased turbidite frequency in the sediment record. Therefore, this study gives new evidence for a strong influence of vegetation changes on slope stability during the middle and late Holocene in the western Swiss Alps, and may be used as a case study for anthropogenically induced landslide activity.
Abstract:
Genetic diversity in plant populations has been shown to affect the species diversity of insects. In grasses, infection with fungal endophytes can also have strong effects on insects, potentially modifying the effects of plant genetic diversity. We manipulated the genetic diversity and endophyte infection of a grass in a field experiment. We show that diversity of primary parasitoids (3rd trophic level) and, especially, secondary parasitoids (4th trophic level) increases with grass genetic diversity while there was no effect of endophyte infection. The increase in insect diversity appeared to be due to a complementarity effect rather than a sampling effect. The higher parasitoid diversity could not be explained by a cascading diversity effect because herbivore diversity was not affected and the same herbivore species were present in all treatments. The effects on the higher trophic levels must therefore be due to a direct response to plant traits or mediated by effects on traits at intermediate trophic levels.
Abstract:
We resumed mowing in two plots of ca. 100 m2 in an abandoned meadow dominated by Brachypodium pinnatum on the slope of Monte Generoso (Switzerland). We monitored species composition and hay yield using point quadrats and biomass samples. Species frequencies changed little during 10 yr (1988–1997), while hay yields showed large fluctuations according to mean relative humidity in April-June. We performed a seed-addition experiment to test whether the establishment of meadow species is limited by a lack of diaspores or of favourable microsites for germination and recruitment from the seed bank. We sowed ca. 12 000 seeds of 12 species originating from a nearby meadow individually in plots of a 4 × 6 unbalanced Latin square with four treatments: burning, mowing, mowing with removal of a layer of decayed organic matter, and a control. We monitored the fate of seedling individuals for 24 months. Seedlings of all species were established and survived for 12 months, 10 species survived for at least 24 months, and some reached a reproductive stage. Species responded to different qualities of microsites provided by the different treatments and thus required different regeneration niches. Spontaneous long-distance immigration was insignificant. We conclude that the former species composition of abandoned meadows cannot easily be restored by mowing alone, because many meadow plant species do not have persistent seed banks, and immigration over distances of more than 25 m followed by successful establishment is very unlikely.
Abstract:
The ability to respond plastically to the environment has allowed amphibians to evolve a response to spatial and temporal variation in predation threat (Benard 2004). Embryos exposed to egg predation are expected to hatch out earlier than their conspecifics. Larval predation can induce a suite of phenotypic changes, including growing a larger tail area. When presented with cues from both egg and larval predators, embryos are expected to respond to the egg predator by hatching out earlier, because the egg predator presents an immediate threat. However, hatching early may be costly in the larval environment in terms of development, morphology, and/or behavior. We conducted a laboratory experiment in which we exposed clutches of spotted salamander (Ambystoma maculatum) eggs to both egg (caddisfly larvae) and larval (A. opacum) predators to test this hypothesis. We recorded hatching time and stage and took developmental and morphological data from the animals a week after hatching. Larvae were entered into lethal predation trials with a predatory sunfish (Lepomis sp.) in order to study behavior. We found that animals exposed to egg predator cues hatched out earlier and at earlier developmental stages than conspecifics, regardless of whether a larval predator was present. Animals exposed to larval predator cues grew relatively larger tails and survived longer in the lethal predation trials. However, the group exposed to both predators showed a cost of early hatching in terms of lower tail area and shorter survival time in predation trials. The measured morphological and developmental effects of hatching plasticity were transient, as there were no developmental or morphological differences between the treatment groups at metamorphosis. Hatching plasticity may be transient, but it is important to the development and survival of many amphibians.
Abstract:
Background. Cardiac tamponade can occur when a large amount of fluid or gas, singly or in combination, accumulates within the pericardium and compresses the heart, causing circulatory compromise. Although previous investigators have found the 12-lead ECG to have a poor predictive value in diagnosing cardiac tamponade, very few studies have evaluated it as a follow-up tool for ruling in or ruling out tamponade in patients with previously diagnosed malignant pericardial effusions. Methods. 127 patients with malignant pericardial effusions at the MD Anderson Cancer Center were included in this retrospective study. While 83 of these patients had cardiac tamponade diagnosed by echocardiographic criteria (gold standard), 44 did not. We computed the sensitivity (Se), specificity (Sp), and positive (PPV) and negative (NPV) predictive values for individual and combinations of ECG abnormalities. Individual ECG abnormalities were also entered singly into a univariate logistic regression model to predict tamponade. Results. For patients with effusions of all sizes, electrical alternans had a Se, Sp, PPV and NPV of 22.61%, 97.61%, 95% and 39.25%, respectively. These parameters for low-voltage complexes were 55.95%, 74.44%, 81.03% and 46.37%, respectively. The presence of all three ECG abnormalities had a Se = 8.33%, Sp = 100%, PPV = 100% and NPV = 35.83%, while the presence of at least one of the three ECG abnormalities had a Se = 89.28%, Sp = 46.51%, PPV = 76.53% and NPV = 68.96%. For patients with effusions of all sizes, electrical alternans had an OR of 12.28 (1.58–95.17, p = 0.016), while the presence of at least one ECG abnormality had an OR of 7.25 (2.9–18.1, p < 0.001) in predicting tamponade. Conclusions. Although individual ECG abnormalities had low sensitivities, specificities, NPVs and PPVs, with the exception of electrical alternans, the presence of at least one of the three ECG abnormalities had a high sensitivity in diagnosing cardiac tamponade. This could point to its potential use as a screening test with a correspondingly high NPV to rule out a diagnosis of tamponade in patients with malignant pericardial effusions. This could save expensive echocardiographic assessments in patients with previously diagnosed pericardial effusions.
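The Se/Sp/PPV/NPV figures above all derive from a standard 2×2 table against the echocardiographic gold standard, and the odds ratio is the cross-product of the same table. A minimal sketch using hypothetical counts (chosen to roughly echo the reported electrical-alternans figures, not the study's raw data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV (as fractions) from a 2x2 table."""
    se = tp / (tp + fn)    # true positives / all with tamponade
    sp = tn / (tn + fp)    # true negatives / all without tamponade
    ppv = tp / (tp + fp)   # true positives / all test-positives
    npv = tn / (tn + fn)   # true negatives / all test-negatives
    return se, sp, ppv, npv

def odds_ratio(tp, fp, fn, tn):
    """Cross-product odds ratio from the same 2x2 table."""
    return (tp * tn) / (fp * fn)

# Hypothetical counts: 83 patients with echocardiographic tamponade and
# 44 without; 19 of the former and 1 of the latter show electrical alternans.
se, sp, ppv, npv = diagnostic_metrics(tp=19, fp=1, fn=64, tn=43)
or_alt = odds_ratio(tp=19, fp=1, fn=64, tn=43)
```

With these counts the sensitivity is low (about 23%) while the specificity is near 98%, matching the abstract's point that alternans is specific but insensitive.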
Abstract:
A face-to-face survey addressing environmental risk perception was conducted from January through March 2010. The 35-question survey was administered to a random sample of 73 households in El Paso, Texas. The instrument, administered in two adjacent residential communities neighboring an inactive copper smelter, solicited responses about manmade and naturally occurring health risks and about the sources of health information that might be utilized by respondents. The objective of the study was to determine whether intervention that occurred in one of the communities increased residents' perception of risk to themselves and their families. The study was undertaken subsequent to increased attention from news media and public debate surrounding the request to reopen the smelter's operations. Results of the study indicated that perceptions of environment-related health concerns were not significantly correlated with residence in the community receiving outreach and intervention. Both communities identified sun exposure as their greatest perceived environmental risk, followed by cigarette smoking. Though industrial by-products and chemical pollution were high-ranking concerns, respondents indicated they felt that the decision not to reopen the smelter reduced risk in these areas. Residents expressed confidence in information received from the local health district, though most indicated they received very little information from that source, suggesting an opportunity for public health education in this community as a strategy to address future health concerns.
Abstract:
This study compared the reported isolations of Mycobacterium kansasii (MK) and Mycobacterium avium-intracellulare (MAI) with Mycobacterium tuberculosis (MTB) in Texas between 1977 and 1983. A total of 15,395 mycobacterial cases were identified, of which 1,352 (8.8%) were MK or MAI. The incidence of MK was higher in urban areas than in nonurban areas (p < .005). The incidence of MAI increased in the Dallas metroplex from 34 cases to 251 over the same time period. Although the number of MK cases previously reported has always exceeded that of MAI, the numbers were equal in the last year (1983) of the study. More than 75% of patients with MK or MAI were Caucasian, compared to only 18% of patients with MTB. Male-to-female ratios for MK and MAI were 3:1 and 3:2, respectively. MK patients were on average 5 years younger than patients with MAI, a finding which concurs with previous studies. MK and MAI pulmonary infections continue to be absent among children and relatively absent among Hispanics. MK appears to be associated with occupations in construction, whereas MAI is more often associated with farm work.
Abstract:
Current measures of the health impact of epidemic influenza are focused on analyses of death certificate data, which may underestimate the true health effect. Previous investigations of influenza-related morbidity have either lacked virologic confirmation of influenza activity in the community or were not population-based. Community virologic surveillance in Houston has demonstrated that influenza viruses have produced epidemics each year since 1974. This study examined the relation of hospitalizations for Acute Respiratory Disease (ARD) to the occurrence of influenza epidemics. Considering only Harris County residents, a total of 13,297 ARD hospital discharge records from hospitals representing 48.4% of Harris County hospital beds were compiled for the period July 1978 through June 1981. Variables collected from each discharge included: age, sex, race, dates of admission and discharge, length of stay, discharge disposition, and a maximum of five diagnoses. This three-year period included epidemics caused by Influenza A/Brazil (H1N1), Influenza B/Singapore, Influenza A/England (H1N1) and Influenza A/Bangkok (H3N2). Correlations of both ARD and pneumonia or influenza hospitalizations with indices of community morbidity (specifically, the weekly frequency of virologically confirmed influenza virus infections) are consistently strong and suggest that hospitalization data reflect the pattern of influenza activity derived from virologic surveillance. While 65 percent of the epidemic-period hospital deaths occurred in patients who were 65 years of age or older, fewer than 25 percent of epidemic-period ARD hospitalizations occurred in persons of that age group. Over 97 percent of epidemic-period hospital deaths were accompanied by a chronic underlying illness; however, 45 percent of ARD hospitalizations during epidemics had no mention of underlying illness. Over 2,500 persons, approximately 35 percent of all persons hospitalized during the three epidemics, would have been excluded in an analysis of high-risk candidates for influenza prophylaxis. These results suggest that examination of hospitalizations for ARD may better define the population at risk for serious morbidity associated with epidemic influenza.
Abstract:
Response to pharmacological treatment is variable among individuals. Some patients respond favorably to a drug while others develop adverse reactions. Early investigations showed evidence of variation in genes that code for drug receptors, drug transporters, and drug-metabolizing enzymes, and pharmacogenetics emerged as the science that studies the relationship between drug response and genetic variation. Thiazide diuretics are the recommended first-line monotherapy for hypertension (i.e. SBP > 140 or DBP > 90). Even so, diuretics are associated with adverse metabolic side effects, such as hyperglycemia, which increase the risk of developing type 2 diabetes. Published approaches testing variation in candidate genes (e.g. the renin-angiotensin-aldosterone system (RAAS) and salt-sensitivity genes) have met with only limited success. We conducted the first genome-wide association study to identify genes influencing hyperglycemia as an adverse effect of thiazide diuretics in non-Hispanic White hypertensive patients participating in the Genetic Epidemiology of Responses to Antihypertensives (GERA) and Pharmacogenomic Evaluation of Antihypertensive Responses (PEAR) clinical trials. No SNP reached the a priori defined threshold of statistical significance (p < 5×10⁻⁸). We detected 50 SNPs in 9 genomic regions with suggestive p-values (p < 1×10⁻⁵). Two of them, rs6870564 (p = 3.28×10⁻⁶) and rs7702121 (p = 5.09×10⁻⁶), were located close to the biologic candidate genes MYO and MGAT1, and one SNP in a genomic region on chromosome 6, rs7762018 (p = 4.59×10⁻⁶), has been previously related to Insulin-Dependent Diabetes Mellitus (IDDM8). I conclude that 1) there are unlikely to be common SNPs with large effects on the adverse metabolic response to hydrochlorothiazide treatment and 2) larger sample sizes are needed for pharmacogenetic studies of inter-individual variation in response to commonly prescribed medication.
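The two thresholds quoted above are the conventional GWAS cut-offs: 5×10⁻⁸ approximates a 0.05 Bonferroni correction over roughly one million independent tests, and 1×10⁻⁵ is a common "suggestive" level. A minimal sketch (not the study's analysis pipeline) classifying the three reported SNPs against them:

```python
GENOME_WIDE = 5e-8   # ~0.05 Bonferroni-corrected for ~1e6 independent tests
SUGGESTIVE = 1e-5    # conventional suggestive-association threshold

def classify_snps(pvalues):
    """Partition SNP p-values into genome-wide significant and suggestive hits."""
    significant = {snp: p for snp, p in pvalues.items() if p < GENOME_WIDE}
    suggestive = {snp: p for snp, p in pvalues.items()
                  if GENOME_WIDE <= p < SUGGESTIVE}
    return significant, suggestive

# The three SNPs highlighted in the abstract, with their reported p-values.
hits = {"rs6870564": 3.28e-6, "rs7702121": 5.09e-6, "rs7762018": 4.59e-6}
significant, suggestive = classify_snps(hits)
```

All three fall in the suggestive band, consistent with the abstract's conclusion that no SNP reached genome-wide significance.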
Abstract:
Objectives. To examine the association between prior rifamycin exposure and later development of C. difficile infection (CDI) caused by a rifamycin-resistant strain of C. difficile, and to compare patient characteristics between rifamycin-resistant and rifamycin-susceptible C. difficile infections. Methods. A case-control study was performed in a large university-affiliated hospital in Houston, Texas. Study subjects were patients with hospital-acquired C. difficile infection with culture-positive isolates of C. difficile for which in vitro rifaximin and rifampin susceptibility had been tested. Prior use of rifamycin and demographic and clinical characteristics were compared between case and control groups using univariate statistics. Results. A total of 49 C. difficile strains met the study inclusion criteria for rifamycin-resistant case isolates, and a total of 98 rifamycin-susceptible C. difficile strains were matched to case isolates. Of the 49 case isolates, 12 (4%) were resistant to rifampin alone, 12 (4%) were resistant to rifaximin alone, and 25 (9%) were resistant to both rifampin and rifaximin. There was no significant association between prior rifamycin use and rifamycin-resistant CDI. Cases and controls did not differ according to demographic characteristics, length of hospital stay, known risk factors for CDI, type of CDI onset, or pre-infection medical co-morbidities. Our results on 37 rifaximin-resistant isolates (MIC ≥ 32 µg/ml) showed that more than half of the isolates had a rifaximin MIC ≥ 256 µg/ml, and of these, 19 isolates had MICs ≥ 1024 µg/ml. Conclusions. Using a large series of rifamycin-non-susceptible isolates, no patient characteristics were independently associated with rifamycin-resistant CDI. These data suggest that factors beyond previous use of rifamycin antibiotics are the primary risk factors for rifamycin-resistant C. difficile.