76 results for "Decline in fertility"
Abstract:
BACKGROUND AND OBJECTIVES: Data suggest that atorvastatin may be nephroprotective. This subanalysis of the Treating to New Targets study investigated how intensive lipid lowering with 80 mg of atorvastatin affects renal function when compared with 10 mg in patients with coronary heart disease. DESIGN, SETTING, PARTICIPANTS, & MEASUREMENTS: A total of 10,001 patients with coronary heart disease and LDL cholesterol levels of <130 mg/dl were randomly assigned to double-blind therapy with 10 or 80 mg/d atorvastatin. Estimated GFR using the Modification of Diet in Renal Disease equation was compared at baseline and at the end of follow-up in 9656 participants with complete renal data. RESULTS: Mean estimated GFR at baseline was 65.6 +/- 11.4 ml/min per 1.73 m2 in the 10-mg group and 65.0 +/- 11.2 ml/min per 1.73 m2 in the 80-mg group. At the end of follow-up (median time to final creatinine measurement 59.5 months), mean change in estimated GFR showed an increase of 3.5 +/- 0.14 ml/min per 1.73 m2 with 10 mg and 5.2 +/- 0.14 ml/min per 1.73 m2 with 80 mg (P < 0.0001 for treatment difference). In the 80-mg arm, estimated GFR improved to ≥60 ml/min per 1.73 m2 in significantly more patients and declined to <60 ml/min per 1.73 m2 in significantly fewer patients than in the 10-mg arm. CONCLUSIONS: The expected 5-yr decline in renal function was not observed. Estimated GFR improved in both treatment groups, but the improvement was significantly greater with 80 mg than with 10 mg, suggesting this benefit may be dosage related.
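For reference, the estimated GFR values above come from the Modification of Diet in Renal Disease (MDRD) study equation. The sketch below shows one widely used 4-variable (IDMS-traceable) form of that equation; the function name and example inputs are illustrative and not taken from the trial.

```python
def egfr_mdrd(serum_creatinine_mg_dl, age_years, is_female, is_black=False):
    """Estimate GFR (ml/min per 1.73 m2) with the 4-variable MDRD equation.

    Uses the IDMS-traceable constant 175; the original MDRD equation used 186.
    """
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if is_female:
        egfr *= 0.742
    if is_black:
        egfr *= 1.212
    return egfr

# Hypothetical example: a 62-year-old woman with serum creatinine 0.9 mg/dl
print(round(egfr_mdrd(0.9, 62, is_female=True), 1))  # prints roughly 63, i.e. in the range of the trial's baseline means
```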
Abstract:
OBJECTIVE: To assess the long-term effect of HAART on non-Hodgkin lymphoma (NHL) incidence in people with HIV (PHIV). DESIGN: Follow-up of the Swiss HIV Cohort Study (SHCS). METHODS: Between 1984 and 2006, 12 959 PHIV contributed a total of 75 222 person-years (py), of which 36 787 were spent under HAART. Among these PHIV, 429 NHL cases were identified from the SHCS dataset and/or by record linkage with Swiss Cantonal Cancer Registries. Age- and gender-standardized incidence was calculated, and Cox regression was used to estimate hazard ratios (HR). RESULTS: NHL incidence reached 13.6 per 1000 py in 1993-1995 and declined to 1.8 in 2002-2006. HAART use was associated with a decline in NHL incidence [HR = 0.26; 95% confidence interval (CI), 0.20-0.33], and this decline was greater for primary brain lymphomas than for other NHL. Among non-HAART users, being a man who has sex with men, being 35 years of age or older, or, most notably, having a low CD4 cell count at study enrollment (HR = 12.26 for <50 versus ≥350 cells/μl; 95% CI, 8.31-18.07) were significant predictors of NHL onset. Among HAART users, only age was significantly associated with NHL risk. The HR for NHL declined steeply in the first months after HAART initiation (HR = 0.46; 95% CI, 0.27-0.77) and was 0.12 (95% CI, 0.05-0.25) 7 to 10 years afterwards. CONCLUSIONS: HAART greatly reduced the incidence of NHL in PHIV, as well as the influence of CD4 cell count on NHL risk. The beneficial effect remained strong up to 10 years after HAART initiation.
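Incidence rates per 1000 person-years and Cox hazard ratios like those reported above can be computed along the following lines. This is a minimal sketch with invented data and column names, assuming the lifelines package; it is not the SHCS analysis code.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical person-level data: follow-up time (years), NHL event flag, HAART exposure
df = pd.DataFrame({
    "years": [2.1, 5.4, 0.8, 7.3, 3.0, 6.2, 4.5, 1.9, 8.0, 2.7],
    "nhl":   [0,   1,   1,   0,   0,   1,   0,   1,   0,   1],
    "haart": [1,   1,   0,   0,   1,   0,   1,   0,   1,   0],
})

# Crude incidence rate per 1000 person-years
rate = 1000 * df["nhl"].sum() / df["years"].sum()
print(f"crude NHL incidence: {rate:.1f} per 1000 py")

# Cox proportional-hazards model: hazard ratio for HAART use
cph = CoxPHFitter()
cph.fit(df, duration_col="years", event_col="nhl")
print(cph.hazard_ratios_)  # HR < 1 would indicate lower NHL risk under HAART
```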
Abstract:
The diagnosis of the obliterative bronchiolitis syndrome in lung transplantation is presently best established by evaluation of postoperative lung function tests. Unfortunately, the decline in lung function occurs only when obliteration has progressed significantly and is therefore not an early predictive indicator. To distinguish patients at increased risk for the development of obliterative bronchiolitis, we regularly assessed the chemiluminescence response of polymorphonuclear leukocytes, opsonic capacity, and plasma elastase/beta-N-acetylglucosaminidase in 52 outpatients (25 women and 27 men; mean age 45 +/- 12 years) who underwent transplantation between January 1991 and January 1992. Recent-onset bronchiolitis within the described observation period occurred in 16 patients (obliterative bronchiolitis group). A matched cohort of 16 patients (control group) was formed from the remaining 36 patients according to type of procedure, age, and follow-up. Data obtained from a period 6 months before clinical onset of the syndrome showed a significant drop in opsonic capacity (obliterative bronchiolitis group = 87% +/- 7%; control = 100% +/- 9%; p < 0.023) and a rise in N-acetyl-D-glucosaminidase (obliterative bronchiolitis group = 7.5 +/- 2 U/L; control = 5.8 +/- 1.8 U/L; p < 0.04). No correlation was found between the number of infectious events or rejection episodes and the incidence of obliterative bronchiolitis. According to these results, it can be concluded that a decrease in plasma opsonic capacity and a rise in beta-N-acetylglucosaminidase may be early markers preceding the clinical onset of obliterative bronchiolitis. The nonspecific immune system may therefore play an important role in the development of obliterative bronchiolitis.
Abstract:
Analyses of pollen, macrofossils and microscopic charcoal in the sediment of a small sub-alpine lake (Karakol, Kyrgyzstan) provide new data to reconstruct the vegetation history of the Kungey Alatau spruce forest during the late-Holocene, i.e. the past 4,000 years. The pollen data suggest that Picea schrenkiana F. and M. was the dominant tree in this region from the beginning of the record. The pollen record of pronounced die-backs of the forests, along with lithostratigraphical evidence, points to possible climatic cooling (and/or drying) around 3,800 cal year B.P., and between 3,350 and 2,520 cal year B.P., with a culmination at 2,800-2,600 cal B.P., although stable climatic conditions are reported for this region for the past 3,000-4,000 years in previous studies. From 2,500 to 190 cal year B.P. high pollen values of P. schrenkiana suggest rather closed and dense forests under the environmental conditions of that time. A marked decline in spruce forests occurred with the onset of modern human activities in the region from 190 cal year B.P. These results show that the present forests are anthropogenically reduced and represent only about half of their potential natural extent. As P. schrenkiana is a species endemic to the western Tien Shan, it is most likely that its refugium was confined to this region. However, our palaeoecological record is too recent to address this hypothesis thoroughly.
Abstract:
The occurrence of sudden cardiac death (SCD) in patients with silent ischemia after myocardial infarction (MI) and the factors facilitating SCD are unknown. This study aimed to determine the factors facilitating SCD in patients with silent ischemia after MI. In the Swiss Interventional Study on Silent Ischemia Type II (SWISSI II), 201 patients with silent ischemia after MI were randomized to percutaneous coronary intervention (PCI) or medical management. The main end point of the present analysis was SCD. Multivariable regression models were used to detect potential associations between baseline or follow-up variables and SCD. During a mean follow-up of 10.3 +/- 2.6 years, 12 SCDs occurred, corresponding to an average annual event rate of 0.6%. On multivariate regression analysis, the decline in left ventricular ejection fraction (LVEF) during follow-up was the only independent predictor of SCD other than age (p = 0.011); baseline LVEF was not. The decline in LVEF was greater in patients receiving medical management than in those who had received PCI (p < 0.001), as well as in patients with residual myocardial ischemia or recurrent MI compared with patients without these findings (p = 0.038 and p < 0.001, respectively). Compared with medical management, PCI reduced the rate of residual myocardial ischemia (p < 0.001) and recurrent MI (p = 0.001) during follow-up. In conclusion, patients with silent ischemia after MI are at substantial risk of SCD. The prevention of residual myocardial ischemia and recurrent MI using PCI resulted in better long-term LVEF and a reduced incidence of SCD.
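As a quick check, the average annual event rate quoted above follows from the abstract's own numbers (12 SCDs over roughly 201 patients followed for a mean of 10.3 years):

```python
# Approximate person-years of follow-up and the resulting annual SCD event rate
patients = 201
mean_follow_up_years = 10.3
scd_events = 12

person_years = patients * mean_follow_up_years   # ~2070 person-years
annual_rate = scd_events / person_years          # events per person-year
print(f"{100 * annual_rate:.1f}% per year")      # ~0.6%, matching the abstract
```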
Abstract:
OBJECTIVE: We report the results and complications associated with standardized intraoperative management designed for the prevention of hemodynamically relevant venous air embolism during surgery performed in the semisitting position. METHODS: A protocol for preoperative evaluation and intraoperative monitoring was developed and applied in 187 consecutive patients who underwent surgery in the semisitting position between 1999 and 2004. The protocol included a preoperative transesophageal echocardiography (TEE) examination, intraoperative TEE monitoring, catheterization of the right atrium, and a combination of fluid input, positive end-expiratory pressure, and standardized positioning aiming at a positive pressure in the transverse and sigmoid sinuses. Data were collected retrospectively from the charts and intraoperative anesthesiological protocols of the patients for the incidence of clinically relevant air embolism (i.e., TEE-diagnosed air embolism plus a decrease in end-tidal CO2 or hemodynamic changes) and other complications related to the semisitting position. RESULTS: Three cases (1.6%) of relevant venous air embolism occurred in 187 patients. Only 1 case (0.5%) was hemodynamically relevant, with a temporary arterial blood pressure decrease and heart rate increase. Pneumocephalus leading to lethargy was a frequent postoperative finding, which resolved spontaneously in all patients except 1 with an epileptic seizure and oculomotor nerve palsy attributable to space-occupying subdurally trapped air, which had to be treated surgically. There was no permanent morbidity or mortality related to the semisitting position. CONCLUSION: Fear of massive venous air embolism is one reason for the dramatic decline in the use of the semisitting position in neurosurgical practice. We found that strict adherence to a standardized protocol using TEE monitoring before and during surgery, exclusion of patients with patent foramen ovale, and a combination of positive end-expiratory pressure, fluid input, and standardized positioning aiming at a positive pressure in the transverse and sigmoid sinuses minimized this complication, with a rate of 0.5% for hemodynamically relevant events.
Abstract:
Lake-effect snow is an important constraint on ecological and socio-economic systems near the North American Great Lakes. Little is known about the Holocene history of lake-effect snowbelts, and it is difficult to decipher how lake-effect snowfall abundance affected ecosystem development. We conducted oxygen-isotope analysis of calcite in lake-sediment cores from northern Lower Michigan to infer Holocene climatic variation and assess snowbelt development. The two lakes experience the same synoptic-scale climatic systems, but only one of them (Huffman Lake) receives a significant amount of lake-effect snow. A 177-cm difference in annual snowfall causes groundwater inflow at Huffman Lake to be 18O-depleted by 2.3‰ relative to O'Brien Lake. To assess when the lake-effect snowbelt became established, we compared calcite-δ18O profiles of the last 11,500 years from these two sites. The chronologies are based on accelerator-mass-spectrometry 14C ages of 11 and 17 terrestrial-plant samples from Huffman and O'Brien lakes, respectively. The values of δ18O are low at both sites from 11,500 to 9500 cal yr BP when the Laurentide Ice Sheet (LIS) exerted a dominant control over the regional climate and provided periodic pulses of meltwater to the Great Lakes basin. Carbonate δ18O increases by 2.6‰ at O'Brien Lake and by 1.4‰ at Huffman Lake between 9500 and 7000 cal yr BP, suggesting a regional decline in the proportion of runoff derived from winter precipitation. The Great Lakes snowbelt probably developed between 9500 and 5500 cal yr BP as inferred from the progressive 18O-depletion at Huffman Lake relative to O'Brien Lake, with the largest increase of lake-effect snow around 7000 cal yr BP. Lake-effect snow became possible at this time because of increasing contact between the Great Lakes and frigid arctic air. These changes resulted from enhanced westerly flow over the Great Lakes as the LIS collapsed, and from rapidly rising Great Lakes levels during the Nipissing Transgression. The δ18O difference between Huffman and O'Brien lakes declines after 5500 cal yr BP, probably because of a northward shift of the polar vortex that brought increasing winter precipitation to the entire region. However, δ18O remains depleted at Huffman Lake relative to O'Brien Lake because of the continued production of lake-effect snow.
Abstract:
Watershed services are the benefits people obtain from the flow of water through a watershed. While demand for such services is increasing in most parts of the world, supply is becoming more insecure due to human impacts on ecosystems, such as climate or land-use change. The population and water management authorities therefore require information on the potential availability of watershed services in the future and the trade-offs involved. In this study, the Soil and Water Assessment Tool (SWAT) is used to model watershed service availability for future management and climate change scenarios in the East African Pangani Basin. In order to quantify actual “benefits”, SWAT2005 was slightly modified, calibrated and configured at the required spatial and temporal resolution so that simulated water resources and processes could be characterized based on their valuation by stakeholders and their accessibility. The calibrated model was then used to evaluate three management and three climate scenarios. The results show that by the year 2025 the greatest challenges will be not primarily the physical availability of water, but access to water resources and the efficiency of their use. Water to cover basic human needs is available at least 95% of the time but must be made accessible to the population through investments in distribution infrastructure. Concerning the trade-off between agricultural use and hydropower production, there is virtually no potential for an increase in hydropower even if it is given priority. Agriculture will necessarily expand spatially as a result of population growth, and can even benefit from higher irrigation water availability per unit area, given improved irrigation efficiency and enforced regulation to ensure equitable distribution of available water. The decline in services from natural terrestrial ecosystems (e.g. charcoal, food), due to the expansion of agriculture, increases the vulnerability of residents who depend on such services, mostly in times of drought. The expected impacts of climate change may contribute to an increase or decrease in watershed service availability, but these effects are marginal and much smaller than management impacts up to the year 2025.
Abstract:
How can we explain the decline in support for the European Union (EU) and the idea of European integration after the onset of the Great Recession in the fall of 2007? Did the economic crisis and the austerity policies that the EU, in tandem with the IMF, imposed on several member countries help cause this drop? While there is some evidence for this direct effect of EU policies, we find that the most significant determinant of trust and support for the EU remains the level of trust in national governments. Based on cue theory and using concepts of diffuse and specific support, we find that support for the EU is derived from evaluations of national politics and policy, which Europeans know far better than the remote political system of the EU. This effect, however, is somewhat muted for those sophisticated Europeans who are more knowledgeable about the EU and are able to form opinions about it independently of the national contexts in which they live. We also find that the recent economic crisis has led to a discernible increase in the number of those who are disillusioned with politics at both the national and the supranational level. We analyze 133 national surveys from 27 EU countries by estimating a series of cross-classified multilevel logistic regression models.
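The paper estimates cross-classified multilevel logistic regression models (respondents nested in both surveys and countries). The sketch below is a deliberately simplified stand-in: a plain logistic regression of EU trust on trust in the national government and a crisis-period indicator, with invented data and column names and without the multilevel structure described in the abstract.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level data pooled across surveys.
# The paper's survey- and country-level random effects are omitted here for brevity.
df = pd.DataFrame({
    "trust_eu":  [1, 1, 0, 1, 1, 0, 0, 0, 1, 0],
    "trust_gov": [1, 1, 1, 1, 0, 0, 0, 0, 1, 0],
    "post_2007": [0, 1, 0, 1, 0, 1, 0, 1, 1, 0],
})

# Simplified individual-level model: EU trust cued by trust in the national government
fit = smf.logit("trust_eu ~ trust_gov + post_2007", data=df).fit(disp=False)
print(fit.params)  # a positive trust_gov coefficient reflects the national-cue effect
```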
Abstract:
BACKGROUND Since 2005, increasing numbers of children have started antiretroviral therapy (ART) in sub-Saharan Africa and, in recent years, WHO and country treatment guidelines have recommended ART initiation for all infants and very young children, and at higher CD4 thresholds for older children. We examined temporal changes in patient and regimen characteristics at ART start using data from 12 cohorts in 4 countries participating in the IeDEA-SA collaboration. METHODOLOGY/PRINCIPAL FINDINGS Data from 30,300 ART-naïve children aged <16 years at ART initiation who started therapy between 2005 and 2010 were analysed. We examined changes in median values for continuous variables using Cuzick's test for trend over time. We also examined changes in the proportions of patients with particular disease severity characteristics (expressed as binary variables, e.g. WHO Stage III/IV vs. I/II) using logistic regression. Between 2005 and 2010 the number of children starting ART each year increased and the median age at initiation declined from 63 months (2006) to 56 months (2010). The proportions of children aged <1 year and ≥10 years both increased, from 12% to 19% and from 18% to 22%, respectively. Children had less severe disease at ART initiation in later years, with significant declines in the percentage with severe immunosuppression (81% to 63%), WHO Stage III/IV disease (75% to 62%), severe anemia (12% to 7%) and weight-for-age z-score <-3 (31% to 28%). Similar results were seen when the analysis was restricted to infants, with significant declines in the proportions with severe immunodeficiency (98% to 82%) and Stage III/IV disease (81% to 63%). First-line regimen use followed country guidelines. CONCLUSIONS/SIGNIFICANCE Between 2005 and 2010 increasing numbers of children initiated ART, with a decline in disease severity at the start of therapy. However, even in 2010, a substantial number of infants and children started ART with advanced disease. These results highlight the importance of efforts to improve access to HIV diagnostic testing and ART in children.
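The trend analysis described above (a binary severity indicator regressed on calendar year of ART start) amounts to a simple logistic regression; a minimal sketch with invented child-level records, assuming statsmodels, is shown below. It is not the IeDEA-SA analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical child-level records: calendar year of ART start and WHO Stage III/IV indicator
df = pd.DataFrame({
    "year":     [2005, 2005, 2006, 2007, 2007, 2008, 2009, 2009, 2010, 2010],
    "stage_34": [1,    1,    1,    0,    1,    0,    1,    0,    0,    0],
})

# Logistic regression of disease severity on calendar year:
# a negative year coefficient indicates declining severity at ART initiation
fit = smf.logit("stage_34 ~ year", data=df).fit(disp=False)
print(fit.params["year"])
```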
Abstract:
At the beginning of the 20th Century, cervical cancer was the leading cause of death from cancer in women. A marked decline in cervical cancer has been observed since the 1960s, in parallel with the introduction of the Papanicolaou (Pap) test as a cytological screening method. Today, Pap smear screening is still the most widely used tool for cervical cancer prevention. Testing for human papillomavirus (HPV) in cervical specimens, or a combination of Pap and HPV testing, is also now available. In this article we compare current guidelines for cervical cancer screening in Switzerland with those in other European countries. In view of the opportunities offered by HPV testing and, since 2008, HPV vaccination, current guidelines for cervical cancer screening should be updated. Both the choice of screening tests and the general organization of cervical cancer screening should be reviewed.
Abstract:
OBJECTIVE Visuoperceptual deficits are common in dementia with Lewy bodies (DLB) and Alzheimer disease (AD). Testing visuoperception in dementia is complicated by decline in other cognitive domains and by extrapyramidal features. To overcome these issues, we developed a computerized test, the Newcastle visuoperception battery (NEVIP), which is independent of motor function and has minimal cognitive load. We aimed to test its utility in identifying visuoperceptual deficits in people with dementia. PARTICIPANTS AND MEASUREMENTS We recruited 28 AD and 26 DLB participants, along with 35 comparison participants of similar age and education. The NEVIP was used to test angle, color, and form discrimination along with motion perception to obtain a composite visuoperception score. RESULTS Those with DLB performed significantly worse than AD participants on the composite visuoperception score (Mann-Whitney U = 142, p = 0.01). Visuoperceptual deficits (defined as performance 2 SD below that of the comparison group) were present in 71% of the DLB group and 40% of the AD group. Performance was not significantly correlated with motor impairment, but was significantly related to global cognitive impairment in DLB (rs = -0.689, p < 0.001), though not in AD. CONCLUSION Visuoperceptual deficits can be detected in both DLB and AD participants using the NEVIP, with the DLB group performing significantly worse than the AD group. Visuoperception scores obtained with the NEVIP are independent of participants' motor deficits, and participants are able to comprehend and perform the tasks.
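The deficit criterion (scoring more than 2 SD below the comparison group) and the Mann-Whitney U comparison between groups can be expressed in a few lines; the scores below are invented and only illustrate the approach, not the NEVIP data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

# Hypothetical composite visuoperception scores
controls = np.array([52, 55, 49, 58, 51, 54, 50, 56])
dlb      = np.array([38, 45, 30, 41, 36, 47, 33])
ad       = np.array([48, 42, 50, 39, 46, 51, 44])

# Deficit threshold: 2 SD below the mean of the comparison group
cutoff = controls.mean() - 2 * controls.std(ddof=1)
print(f"deficit cutoff: {cutoff:.1f}")
print("proportion of DLB with deficit:", np.mean(dlb < cutoff))
print("proportion of AD with deficit: ", np.mean(ad < cutoff))

# Non-parametric group comparison (DLB vs AD), as in the abstract's Mann-Whitney U test
u, p = mannwhitneyu(dlb, ad, alternative="two-sided")
print(f"U = {u:.0f}, p = {p:.3f}")
```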
Abstract:
Parkinson's disease, typically thought of as a movement disorder, is increasingly recognized as causing cognitive impairment and dementia. Eye movement abnormalities are also described, including impairment of rapid eye movements (saccades) and the fixations interspersed between them. Such movements are under the influence of cortical and subcortical networks commonly targeted by the neurodegeneration seen in Parkinson's disease and, as such, may provide a marker for cognitive decline. This study examined the error rates and visual exploration strategies of subjects with Parkinson's disease, with and without cognitive impairment, whilst performing a battery of visuo-cognitive tasks. Error rates were significantly higher in those Parkinson's disease groups with either mild cognitive impairment (P = 0.001) or dementia (P < 0.001), than in cognitively normal subjects with Parkinson's disease. When compared with cognitively normal subjects with Parkinson's disease, exploration strategy, as measured by a number of eye tracking variables, was least efficient in the dementia group but was also affected in those subjects with Parkinson's disease with mild cognitive impairment. When compared with control subjects and cognitively normal subjects with Parkinson's disease, saccade amplitudes were significantly reduced in the groups with mild cognitive impairment or dementia. Fixation duration was longer in all Parkinson's disease groups compared with healthy control subjects but was longest for cognitively impaired Parkinson's disease groups. The strongest predictor of average fixation duration was disease severity. Analysing only data from the most complex task, with the highest error rates, both cognitive impairment and disease severity contributed to a predictive model for fixation duration [F(2,76) = 12.52, P ≤ 0.001], but medication dose did not (r = 0.18, n = 78, P = 0.098, not significant). This study highlights the potential use of exploration strategy measures as a marker of cognitive decline in Parkinson's disease and reveals the efficiency by which fixations and saccades are deployed in the build-up to a cognitive response, rather than merely focusing on the outcome itself. The prolongation of fixation duration, present to a small but significant degree even in cognitively normal subjects with Parkinson's disease, suggests a disease-specific impact on the networks directing visual exploration, although the study also highlights the multi-factorial nature of changes in exploration and the significant impact of cognitive decline on efficiency of visual search.
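The reported predictive model for fixation duration (two predictors assessed with an overall F-test) corresponds to an ordinary multiple linear regression; the sketch below uses invented per-subject values and assumes statsmodels, and is not the study's analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-subject data: mean fixation duration (ms), cognitive score, disease severity
df = pd.DataFrame({
    "fix_ms":    [310, 295, 360, 420, 385, 450, 330, 405, 470, 340],
    "cognition": [29,  28,  25,  21,  24,  18,  27,  22,  17,  26],
    "severity":  [12,  10,  22,  30,  25,  38,  15,  28,  41,  18],
})

# Multiple linear regression of fixation duration on cognition and disease severity
fit = smf.ols("fix_ms ~ cognition + severity", data=df).fit()
print(fit.fvalue, fit.f_pvalue)  # overall F-test, analogous to the F(2,76) reported above
print(fit.params)
```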
Abstract:
Agricultural intensification has caused a decline in structural elements in European farmland, where natural habitats are increasingly fragmented. The loss of habitat structures has a detrimental effect on biodiversity and affects bat species that depend on vegetation structures for foraging and commuting. We investigated the impact of connectivity and configuration of structural landscape elements on flight activity, species richness and diversity of insectivorous bats, distinguishing three bat guilds according to species-specific bioacoustic characteristics. We tested whether bats with shorter-range echolocation were more sensitive to habitat fragmentation than bats with longer-range echolocation. We expected to find different connectivity thresholds for the three guilds and hypothesized that bats prefer linear over patchy landscape elements. Bat activity was quantified using repeated acoustic monitoring at 225 locations in 15 study plots distributed across the Swiss Central Plateau, where connectivity and the shape of landscape elements were determined by spatial analysis (GIS). Spectrograms of bat calls were assigned to species with the software batit by means of image recognition and statistical classification algorithms. Bat activity was significantly higher around landscape elements than in open control areas. Short- and long-range echolocating bats were more active in well-connected landscapes, but optimal connectivity levels differed between the guilds. Species richness increased significantly with connectivity, while species diversity (Shannon's diversity index) did not. Total bat activity was unaffected by the shape of landscape elements. Synthesis and applications. This study highlights the importance of connectivity in farmland landscapes for bats, with shorter-range echolocating bats being particularly sensitive to habitat fragmentation. More structurally diverse landscape elements are likely to reduce population declines of bats and could improve conditions for other declining species, including birds. Activity was highest around optimal values of connectivity, which must be evaluated for the different guilds and spatially targeted to a region's habitat configuration. In a multi-species approach, we recommend that the reintroduction of structural elements to increase habitat heterogeneity become part of agri-environment schemes.
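Species diversity here refers to Shannon's diversity index, H' = -Σ p_i ln(p_i), computed over the relative activity of each species; a small illustrative computation with invented bat-pass counts follows.

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon's diversity index H' = -sum(p_i * ln p_i) over nonzero proportions."""
    counts = np.asarray(counts, dtype=float)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

# Hypothetical bat-pass counts per species recorded at one location
passes = [120, 45, 30, 8, 2]
print(f"species richness: {sum(c > 0 for c in passes)}")
print(f"Shannon diversity: {shannon_diversity(passes):.2f}")
```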
Abstract:
Background: The prevalence of hypertension in HIV infection is high, and information on blood pressure control in HIV-infected individuals is insufficient. We modeled blood pressure over time and the risk of cardiovascular events in hypertensive HIV-infected individuals. Methods: All patients from the Swiss HIV Cohort Study with confirmed hypertension (systolic or diastolic blood pressure above 139 or 89 mm Hg on 2 consecutive visits and presence of at least 1 additional cardiovascular risk factor) between April 1, 2000 and March 31, 2011 were included. Patients with previous cardiovascular events, patients already on antihypertensive drugs, and pregnant women were excluded. Change in blood pressure over time was modeled using linear mixed models with repeated measurements. Results: Hypertension was diagnosed in 2595 of 10,361 eligible patients. Of those, 869 initiated antihypertensive treatment. For patients treated for hypertension, we found a mean (95% confidence interval) change in systolic and diastolic blood pressure of −0.82 (−1.06 to −0.58) mm Hg/yr and −0.89 (−1.05 to −0.73) mm Hg/yr, respectively. Factors associated with a decline in systolic blood pressure were baseline blood pressure, presence of chronic kidney disease, cardiovascular events, and the typical risk factors for cardiovascular disease. In patients with hypertension, an increase in systolic blood pressure [hazard ratio 1.18 (1.06 to 1.32) per 10 mm Hg increase], total cholesterol, smoking, age, and cumulative exposure to protease inhibitor-based and triple-nucleoside regimens were associated with cardiovascular events. Conclusions: Insufficient control of hypertension was associated with an increased risk of cardiovascular events, indicating the need for improved management of hypertension in HIV-infected individuals.
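Modeling blood pressure change over time with a linear mixed model for repeated measurements, as described above, can be sketched as follows; the patient records and column names are invented, statsmodels is assumed, and this is not the cohort's analysis code.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical repeated systolic blood pressure measurements per patient
df = pd.DataFrame({
    "patient": [1, 1, 1, 2, 2, 2, 3, 3, 3, 4, 4, 4],
    "years":   [0, 1, 2, 0, 1, 2, 0, 1, 2, 0, 1, 2],
    "sbp":     [152, 148, 147, 160, 155, 151, 145, 146, 142, 158, 152, 150],
})

# Linear mixed model: fixed effect of time since hypertension diagnosis,
# random intercept per patient to account for repeated measurements
mixed = smf.mixedlm("sbp ~ years", data=df, groups=df["patient"]).fit()
print(mixed.params["years"])  # average change in systolic blood pressure per year (mm Hg/yr)
```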