957 results for J11 - Demographic Trends and Forecasts


Relevance: 100.00%

Abstract:

The purpose of this research is to assess public values and perceptions concerning industrial heritage in the Keweenaw by studying visitors at an endangered mining site tour. This research presents and analyzes feedback collected directly from participants in the Cliff Mine (Michigan) archaeological field school public tour surveys in June 2011, gathers semi-structured interview data from survey participants and local experts, and synthesizes both survey and interview data. As researchers of heritage site visitors have found, outreach is effective only when the visitors themselves are well understood. To that end, an opinion survey with pre-tour and post-tour questions was created to collect demographic information and qualitative feedback from visitors at the Cliff Mine field school public tours. The survey found that all visitors who completed it supported preservation and that most were adults over 46 years of age. Most visitors were white-collar professionals, identified as local residents, and learned about the tour through the newspaper. Interview questions were constructed to supplement and expand on the visitor survey results; local experts involved in Keweenaw heritage were also interviewed. All interviewees supported heritage preservation but often held conflicting views when activities such as mineral collecting were factored into the preservation question. Based on the survey and interview responses, improvements to outreach efforts at the Cliff Mine are recommended. Future research should further explore perceptions of social class and identity and should seek out stakeholders not contacted through this research, in order to improve outreach and include all community groups.

Relevance: 100.00%

Abstract:

Activity of clotting factor VIII has been shown to increase acutely with sympathetic nervous system stimulation. We investigated whether aspirin and propranolol affect the responsiveness of plasma clotting factor VIII activity to acute psychosocial stress. We randomized 54 healthy subjects, double-blind, to 5-day treatment with a single daily oral dose of either 100 mg aspirin plus 80 mg propranolol combined, 100 mg aspirin, 80 mg propranolol, or placebo. Thereafter, subjects underwent a 13-min standardized psychosocial stressor. Plasma clotting factor VIII activity was determined immediately before, immediately after, and 45 min and 105 min after stress. Controlling for demographic, metabolic, and lifestyle factors, repeated-measures analysis of covariance showed that the change in clotting factor VIII activity from prestress to 105 min poststress differed between medication groups (P = 0.023; partial eta = 0.132). Clotting factor VIII activity decreased from prestress to immediately poststress in the aspirin/propranolol group relative to the placebo group (P = 0.048) and the aspirin group (P < 0.06). Between 45 min and 105 min poststress, clotting factor VIII levels increased in the aspirin/propranolol group relative to the placebo group (P = 0.007) and the aspirin group (P = 0.039). The stress response in clotting factor VIII activity did not differ significantly between the aspirin/propranolol group and the propranolol group. Propranolol in combination with aspirin diminished the acute response in clotting factor VIII activity to psychosocial stress compared with placebo and with aspirin alone. The effect of aspirin alone on the acute clotting factor VIII stress response was indistinguishable from a placebo effect.
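The group comparison described above can be pictured with a minimal sketch: compute each subject's change in factor VIII activity from prestress to 105 min poststress and compare the four medication groups. This is a deliberately simplified stand-in (a one-way ANOVA on change scores rather than the study's repeated-measures ANCOVA with covariates), and every value below is a hypothetical placeholder, not study data.

```python
# Simplified sketch: compare prestress-to-105-min change in factor VIII
# activity across the four medication groups with a one-way ANOVA.
# The original study used a repeated-measures ANCOVA with covariates;
# this only illustrates the group comparison. Data are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
groups = {
    "aspirin+propranolol": rng.normal(-5, 10, 14),  # change scores (% activity)
    "aspirin":             rng.normal( 5, 10, 14),
    "propranolol":         rng.normal(-2, 10, 13),
    "placebo":             rng.normal( 8, 10, 13),
}

f_stat, p_value = stats.f_oneway(*groups.values())
print(f"One-way ANOVA on change scores: F = {f_stat:.2f}, p = {p_value:.3f}")
```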

Relevance: 100.00%

Abstract:

BACKGROUND: Hypereosinophilic syndrome (HES) is a heterogeneous group of rare disorders defined by persistent blood eosinophilia ≥1.5 × 10^9/L, absence of a secondary cause, and evidence of eosinophil-associated pathology. With the exception of a recent multicenter trial of mepolizumab (anti-IL-5 mAb), published therapeutic experience has been restricted to case reports and small case series. OBJECTIVE: The purpose of the study was to collect and summarize baseline demographic, clinical, and laboratory characteristics in a large, diverse cohort of patients with HES and to review responses to treatment with conventional and novel therapies. METHODS: Clinical and laboratory data from 188 patients with HES, seen between January 2001 and December 2006 at 11 institutions in the United States and Europe, were collected retrospectively by chart review. RESULTS: Eighteen of 161 patients tested (11%) were Fip1-like 1-platelet-derived growth factor receptor alpha (FIP1L1-PDGFRA) mutation-positive, and 29 of 168 patients tested (17%) had a demonstrable aberrant or clonal T-cell population. Corticosteroid monotherapy induced complete or partial responses at 1 month in 85% (120/141) of patients, with most remaining on maintenance doses (median dose, 10 mg prednisone equivalent daily, for durations of 2 months to 20 years). Hydroxyurea and IFN-alpha (used in 64 and 46 patients, respectively) were also effective, but their use was limited by toxicity. Imatinib (used in 68 patients) was more effective in patients with the FIP1L1-PDGFRA mutation (88%) than in those without (23%; P < .001). CONCLUSION: This study, the largest clinical analysis of patients with HES to date, not only provides useful information for clinicians but also should stimulate prospective trials to optimize treatment of HES.

Relevance: 100.00%

Abstract:

BACKGROUND: Elevated plasma fibrinogen levels have prospectively been associated with an increased risk of coronary artery disease in different populations. Plasma fibrinogen is a marker of the systemic inflammation that is crucially involved in atherosclerosis. The vagus nerve curtails inflammation via a cholinergic anti-inflammatory pathway. We hypothesized that lower vagal control of the heart relates to higher plasma fibrinogen levels. METHODS: Study participants were 559 employees (age 17-63 years; 89% men) of an airplane manufacturing plant in southern Germany. All subjects underwent medical examination, blood sampling, and 24-hour ambulatory heart rate recording while keeping to their work routine. The root mean square of successive differences in RR intervals during the night period (nighttime RMSSD) was computed as the heart rate variability index of vagal function. RESULTS: After controlling for demographic, lifestyle, and medical factors, nighttime RMSSD explained 1.7% (P = 0.001), 0.8% (P = 0.033), and 7.8% (P = 0.007) of the variance in fibrinogen levels in all subjects, men, and women, respectively. Nighttime RMSSD and fibrinogen levels were more strongly correlated in women than in men. In all workers, men, and women, respectively, there was a mean ± SEM increase of 0.41 ± 0.13 mg/dL, 0.28 ± 0.13 mg/dL, and 1.16 ± 0.41 mg/dL fibrinogen for each millisecond decrease in nighttime RMSSD. CONCLUSIONS: Reduced vagal outflow to the heart correlated with elevated plasma fibrinogen levels independently of established cardiovascular risk factors. This relationship appeared to be stronger in women than in men. Such an autonomic mechanism might contribute to the atherosclerotic process and its thrombotic complications.
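The RMSSD index used above has a simple closed form, RMSSD = sqrt(mean(ΔRR²)). The sketch below computes it from a series of RR intervals and then estimates the fibrinogen change per millisecond decrease in RMSSD with an ordinary linear regression; all arrays are hypothetical illustrations, not study data.

```python
# Minimal sketch: compute RMSSD from RR intervals (ms) and estimate the
# linear slope of fibrinogen (mg/dL) on nighttime RMSSD. Data are hypothetical.
import numpy as np
from scipy import stats

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences in RR intervals."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return np.sqrt(np.mean(diffs ** 2))

# Example: nighttime RR intervals for one subject (ms)
rr = [812, 830, 825, 841, 818, 836, 829]
print(f"nighttime RMSSD = {rmssd(rr):.1f} ms")

# Across subjects, regress fibrinogen on RMSSD (hypothetical cohort arrays)
rmssd_values = np.array([22.0, 35.5, 41.2, 18.7, 55.3, 29.9])
fibrinogen   = np.array([310., 285., 270., 330., 250., 300.])   # mg/dL
slope, intercept, r, p, se = stats.linregress(rmssd_values, fibrinogen)
print(f"fibrinogen increase per 1 ms decrease in RMSSD: {-slope:.2f} (SE {se:.2f}) mg/dL")
```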

Relevance: 100.00%

Abstract:

Famines are often linked to drought in semi-arid areas of Sub-Saharan Africa, where not only pastoralists but, increasingly, agro-pastoralists are affected. This study addresses the interplay between drought and famine in the rural semi-arid areas of Makueni district, Kenya, by examining whether, and how, crop production conditions and agro-pastoral strategies predispose smallholder households to drought-triggered food insecurity. If this hypothesis holds, then approaches to dealing with drought and famine have to target the factors that cause household food insecurity during non-drought periods. Data from a longitudinal survey of 127 households, interviews, workshops, and daily rainfall records (1961–2003) were analysed using quantitative and qualitative methods. This integrated approach confirms the hypothesis and reveals that factors other than rainfall play a key role, namely asset and labour constraints, inadequate policy enforcement, and the poverty-driven inability to adopt risk-averse production systems. When these factors are linked to the high rainfall variability, farmer-relevant definitions and forecasts of drought have to be applied.

Relevance: 100.00%

Abstract:

Three-dimensional (3D) immersive virtual worlds have been touted as being capable of facilitating highly interactive, engaging, multimodal learning experiences. Much of the evidence gathered to support these claims has been anecdotal but the potential that these environments hold to solve traditional problems in online and technology-mediated education—primarily learner isolation and student disengagement—has resulted in considerable investments in virtual world platforms like Second Life, OpenSimulator, and Open Wonderland by both professors and institutions. To justify this ongoing and sustained investment, institutions and proponents of simulated learning environments must assemble a robust body of evidence that illustrates the most effective use of this powerful learning tool. In this authoritative collection, a team of international experts outline the emerging trends and developments in the use of 3D virtual worlds for teaching and learning. They explore aspects of learner interaction with virtual worlds, such as user wayfinding in Second Life, communication modes and perceived presence, and accessibility issues for elderly or disabled learners. They also examine advanced technologies that hold potential for the enhancement of learner immersion and discuss best practices in the design and implementation of virtual world-based learning interventions and tasks. By evaluating and documenting different methods, approaches, and strategies, the contributors to Learning in Virtual Worlds offer important information and insight to both scholars and practitioners in the field. AU Press is an open access publisher and the book is available for free in PDF format as well as for purchase on our website: http://bit.ly/1W4yTRA

Relevance: 100.00%

Abstract:

Many Member States of the European Union (EU) currently monitor antimicrobial resistance in zoonotic agents, including Salmonella and Campylobacter. According to Directive 2003/99/EC, Member States shall ensure that the monitoring provides comparable data on the occurrence of antimicrobial resistance. The European Commission asked the European Food Safety Authority to prepare detailed specifications for harmonised schemes for monitoring antimicrobial resistance. The objective of these specifications is to lay down provisions for a monitoring and reporting scheme for Salmonella in fowl (Gallus gallus), turkeys and pigs, and for Campylobacter jejuni and Campylobacter coli in broiler chickens. The current specifications are considered a first step towards the gradual implementation of comprehensive antimicrobial resistance monitoring at the EU level. The specifications propose testing a common set of antimicrobial agents against available cut-off values, over a specified concentration range, to determine the susceptibility of Salmonella and Campylobacter. For isolates collected through programmes in which the sampling frame covers all epidemiological units of the national production, the target number of Salmonella isolates to be included in the antimicrobial resistance monitoring per Member State per year is 170 for each study population (i.e., laying hens, broilers, turkeys and slaughter pigs); the target number of Campylobacter isolates is likewise 170 for each study population (i.e., broilers). The results of the antimicrobial resistance monitoring are assessed and reported in the yearly national report on trends and sources of zoonoses, zoonotic agents and antimicrobial resistance.
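The susceptibility-testing step of such a scheme can be pictured as comparing each measured minimum inhibitory concentration (MIC) against an epidemiological cut-off value. The sketch below illustrates that classification only; the antimicrobial agents and cut-off values shown are placeholders, not the actual values specified by EFSA or EUCAST.

```python
# Illustrative sketch: classify an isolate as susceptible (wild-type) or
# resistant (non-wild-type) by comparing measured MICs against cut-off values.
# The cut-offs and MICs below are hypothetical placeholders.
from typing import Dict

CUTOFFS_MG_L: Dict[str, float] = {   # hypothetical cut-offs (mg/L)
    "ciprofloxacin": 0.06,
    "tetracycline": 8.0,
    "gentamicin": 2.0,
}

def classify(mic_results: Dict[str, float]) -> Dict[str, str]:
    """Return 'resistant' if the MIC exceeds the cut-off, else 'susceptible'."""
    return {
        agent: ("resistant" if mic > CUTOFFS_MG_L[agent] else "susceptible")
        for agent, mic in mic_results.items()
        if agent in CUTOFFS_MG_L
    }

isolate = {"ciprofloxacin": 0.12, "tetracycline": 4.0, "gentamicin": 1.0}
print(classify(isolate))  # ciprofloxacin resistant, the other two susceptible
```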

Relevance: 100.00%

Abstract:

To assess the prevalence of tooth wear on buccal/facial and lingual/palatal tooth surfaces and identify related risk factors in a sample of young European adults aged 18-35 years. Calibrated and trained examiners measured tooth wear using the basic erosive wear examination (BEWE) in 3187 patients in seven European countries and assessed the impact of risk factors with a previously validated questionnaire. Each individual was characterized by the highest BEWE score recorded for any scoreable surface. Bivariate analyses examined the proportion of participants who scored 2 or 3 in relation to a range of demographic, dietary and oral care variables. The highest tooth wear BEWE score was 0 for 1368 patients (42.9%), 1 for 883 (27.7%), 2 for 831 (26.1%) and 3 for 105 (3.3%). There were large differences between countries, with the highest levels of tooth wear observed in the UK. Important risk factors for tooth wear included heartburn or acid reflux, repeated vomiting, residence in rural areas, electric tooth brushing and snoring. We found no evidence that waiting after breakfast before tooth brushing has any effect on the degree of tooth wear (p=0.088). Fresh fruit and juice intake was positively associated with tooth wear. In this adult sample, 29% showed signs of notable tooth wear (BEWE score of 2 or 3), making it a common presenting feature in European adults.
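The scoring rule described above (characterising each individual by the highest BEWE score on any scoreable surface, then tabulating the proportion scoring 2 or 3) can be sketched as follows; the surface-level scores are hypothetical.

```python
# Sketch: take each subject's highest BEWE score over all scoreable surfaces,
# then compute the proportion with a score of 2 or 3. Scores are hypothetical.
import pandas as pd

# one row per examined surface: subject id and its BEWE score (0-3)
surface_scores = pd.DataFrame({
    "subject": [1, 1, 1, 2, 2, 3, 3, 3, 4],
    "bewe":    [0, 1, 2, 0, 0, 1, 3, 2, 1],
})

highest = surface_scores.groupby("subject")["bewe"].max()
proportion_2_or_3 = (highest >= 2).mean()
print(highest.to_dict())                  # e.g. {1: 2, 2: 0, 3: 3, 4: 1}
print(f"proportion with BEWE 2 or 3: {proportion_2_or_3:.1%}")
```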

Relevance: 100.00%

Abstract:

BACKGROUND The electrocardiographic PR interval increases with aging, differs by race, and is associated with atrial fibrillation (AF), pacemaker implantation, and all-cause mortality. We sought to determine the associations between PR interval and heart failure, AF, and mortality in a biracial cohort of older adults. METHODS AND RESULTS The Health, Aging, and Body Composition (Health ABC) Study is a prospective, biracial cohort. We used multivariable Cox proportional hazards models to examine PR interval (hazard ratios expressed per SD increase) and 10-year risks of heart failure, AF, and all-cause mortality. Multivariable models included demographic, anthropometric, and clinical variables in addition to established cardiovascular risk factors. We examined 2722 Health ABC participants (aged 74±3 years, 51.9% women, and 41% black). We did not identify significant effect modification by race for the outcomes studied. After multivariable adjustment, every SD increase (29 ms) in PR interval was associated with a 13% greater 10-year risk of heart failure (95% confidence interval, 1.02-1.25) and a 13% increased risk of incident AF (95% confidence interval, 1.04-1.23). PR interval >200 ms was associated with a 46% increased risk of incident heart failure (95% confidence interval, 1.11-1.93). PR interval was not associated with increased all-cause mortality. CONCLUSIONS We identified significant relationships of PR interval to heart failure and AF in older adults. Our findings extend prior investigations by examining PR interval and associations with adverse outcomes in a biracial cohort of older men and women.
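Because the hazard ratios above are expressed per SD (29 ms) increase in PR interval, they can be converted to other increments under the proportional-hazards form HR(d) = exp(β·d). A small worked example, using the reported 13% per-SD estimate purely as an illustration:

```python
# Worked conversion: a Cox model gives a log-hazard coefficient per millisecond
# of PR interval; the abstract reports hazard ratios per SD (29 ms) increase.
# Under proportional hazards, the HR over an increment d is exp(beta * d).
import math

sd_ms = 29.0
hr_per_sd = 1.13                         # the reported 13% greater risk per SD
beta_per_ms = math.log(hr_per_sd) / sd_ms
print(f"beta per ms = {beta_per_ms:.5f}")

# Back-convert to an arbitrary increment, e.g. 50 ms
print(f"HR per 50 ms = {math.exp(beta_per_ms * 50):.2f}")
```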

Relevance: 100.00%

Abstract:

Speciation is a fundamental evolutionary process, the knowledge of which is crucial for understanding the origins of biodiversity. Genomic approaches are an increasingly important aspect of this research field. We review current understanding of genome-wide effects of accumulating reproductive isolation and of genomic properties that influence the process of speciation. Building on this work, we identify emergent trends and gaps in our understanding, propose new approaches to more fully integrate genomics into speciation research, translate speciation theory into hypotheses that are testable using genomic tools, and provide an integrative definition of the field of speciation genomics.

Relevance: 100.00%

Abstract:

BACKGROUND: Renal involvement is a serious manifestation of systemic lupus erythematosus (SLE); it may portend a poor prognosis, as it can lead to end-stage renal disease (ESRD). The purpose of this study was to determine the factors predicting the development of renal involvement and its progression to ESRD in a multi-ethnic SLE cohort (PROFILE). METHODS AND FINDINGS: PROFILE includes SLE patients from five different United States institutions. We examined at baseline the socioeconomic-demographic, clinical, and genetic variables associated with the development of renal involvement and its progression to ESRD by univariable and multivariable Cox proportional hazards regression analyses. Analyses of onset of renal involvement included only patients with renal involvement after SLE diagnosis (n = 229). Analyses of ESRD included all patients, regardless of whether renal involvement occurred before, at, or after SLE diagnosis (34 of 438 patients). In addition, we performed a multivariable logistic regression analysis of the variables associated with the development of renal involvement at any time during the course of SLE. In the time-dependent multivariable analysis, patients developing renal involvement were more likely to have more American College of Rheumatology criteria for SLE, and to be younger, hypertensive, and of African-American or Hispanic (from Texas) ethnicity. Alternative regression models were consistent with these results. In addition to greater accrued disease damage (renal damage excluded), younger age, and Hispanic ethnicity (from Texas), homozygosity for the valine allele of FcgammaRIIIa (FCGR3A*GG) was a significant predictor of ESRD. Results from the multivariable logistic regression model that included all cases of renal involvement were consistent with those from the Cox model. CONCLUSIONS: Fcgamma receptor genotype is a risk factor for progression of renal disease to ESRD. Since the frequency distribution of FCGR3A alleles does not vary significantly among the ethnic groups studied, the additional factors underlying the ethnic disparities in renal disease progression remain to be elucidated.
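The final analytic step described above is a multivariable logistic regression of renal involvement on baseline variables. A minimal sketch of such a model is shown below; the data frame, predictor names, and values are hypothetical placeholders, not the PROFILE data.

```python
# Minimal sketch: multivariable logistic regression of renal involvement on a
# few baseline predictors, analogous in form to the model described in the
# abstract. All data and variable names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 438
df = pd.DataFrame({
    "age_at_dx":    rng.normal(35, 12, n),
    "hypertension": rng.integers(0, 2, n),
    "acr_criteria": rng.integers(4, 11, n),
    "renal":        rng.integers(0, 2, n),   # outcome: renal involvement yes/no
})

X = sm.add_constant(df[["age_at_dx", "hypertension", "acr_criteria"]])
model = sm.Logit(df["renal"], X).fit(disp=False)
print(np.exp(model.params))   # odds ratios per unit increase in each predictor
```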

Relevance: 100.00%

Abstract:

Recently it has been proposed that the evaluation of effects of pollutants on aquatic organisms can provide an early warning system of potential environmental and human health risks (NRC 1991). Unfortunately, few methods are available to aquatic biologists for assessing the effects of pollutants on aquatic animal community health. The primary goal of this research was to develop and evaluate the feasibility of such a method. Specifically, the primary objective of this study was to develop a prototype rapid bioassessment technique, similar to the Index of Biotic Integrity (IBI), for the upper Texas and Northwestern Gulf of Mexico coastal tributaries. The IBI consists of a series of "metrics", each describing a specific attribute of the aquatic community. Each metric is given a score, and the scores are subtotaled to derive a total assessment of the "health" of the aquatic community. This IBI procedure may provide an additional assessment tool for professionals in water quality management.

The experimental design consisted primarily of compiling previously collected data from monitoring conducted by the Texas Natural Resource Conservation Commission (TNRCC) at five bayous classified according to potential for anthropogenic impact and salinity regime. Standardized hydrological, chemical, and biological monitoring had been conducted in each of these watersheds. The identification and evaluation of candidate metrics for inclusion in the estuarine IBI was conducted through the use of correlation analysis, cluster analysis, stepwise and normal discriminant analysis, and evaluation of cumulative distribution frequencies. Scores of each included metric were determined based on exceedances of specific percentiles. Individual scores were summed, and a total IBI score and rank for the community computed.

Results of these analyses yielded the proposed metrics and rankings listed in this report. Based on the results of this study, incorporation of an estuarine IBI method as a water quality assessment tool is warranted. Adopted metrics were correlated to seasonal trends and less so to the salinity gradients observed during the study (0-25 ppt). Further refinement of this method is needed using a larger, more inclusive data set that includes additional habitat types, salinity ranges, and temporal variation.
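The percentile-based scoring and summation described above can be sketched as follows; the metrics, reference percentiles, and the 5/3/1 scoring convention are hypothetical placeholders rather than the metrics adopted in this study.

```python
# Minimal sketch of Index of Biotic Integrity (IBI) scoring: each community
# metric is scored by comparison against reference percentiles, and the metric
# scores are summed into a total IBI. Metrics, percentile thresholds, and the
# 5/3/1 scoring convention below are hypothetical placeholders.
from typing import Dict, Tuple

# (25th percentile, 75th percentile) of reference-site distributions
REFERENCE_PERCENTILES: Dict[str, Tuple[float, float]] = {
    "species_richness":  (8.0, 15.0),
    "pct_tolerant_taxa": (20.0, 45.0),   # lower is better for this metric
    "trophic_diversity": (1.2, 2.1),
}
LOWER_IS_BETTER = {"pct_tolerant_taxa"}

def score_metric(name: str, value: float) -> int:
    lo, hi = REFERENCE_PERCENTILES[name]
    if name in LOWER_IS_BETTER:
        return 5 if value <= lo else 3 if value <= hi else 1
    return 5 if value >= hi else 3 if value >= lo else 1

def total_ibi(metrics: Dict[str, float]) -> int:
    return sum(score_metric(name, value) for name, value in metrics.items())

site = {"species_richness": 12.0, "pct_tolerant_taxa": 30.0, "trophic_diversity": 2.3}
print(total_ibi(site))   # 3 + 3 + 5 = 11 for this hypothetical site
```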

Relevance: 100.00%

Abstract:

Limited research has been conducted evaluating programs designed to improve the outcomes of homeless adults with mental disorders and comorbid alcohol, drug and mental disorders. This study conducted such an evaluation in a community-based day treatment setting with clients of the Harris County Mental Health and Mental Retardation Authority's Bristow Clinic. The study population included all clients who received treatment at the clinic for a minimum of six months between January 1, 1995 and August 31, 1996. An electronic database was used to identify clients and to track their program involvement. A profile was developed of the study participants, and their level of program involvement was examined, including the amount of time spent in clinical, social and other interventions, the types of interventions encountered and the number of interventions encountered. Results were analyzed to determine whether social, demographic and mental health history affected levels of program involvement, and what effects the levels of program involvement had on housing status and psychiatric functioning status.

A total of 101 clients met the inclusion criteria. Of the 101 clients, 96 had a mental disorder and five had comorbidity. Because of the limited number of participants with comorbidity, only those with mental disorders were included in the analysis. The study found the Bristow Clinic population to be primarily single, Black, male, between the ages of 31 and 40 years, and with a gross family income of less than $4,000. More persons were residing on the streets at entry and at six months following treatment than in any other residential setting. The most prevalent psychiatric diagnoses were depressive disorders and schizophrenia. The Global Assessment of Functioning (GAF) scale, which was used to determine the degree of psychiatric functioning, revealed a modal GAF score of 31-40 both at entry and after six months in treatment. The study found that the majority of clients spent less than 17 hours in treatment, had fewer than 51 encounters, and had clinical, social, and other encounters. With regard to social and demographic factors and levels of program involvement, there were statistically significant associations between gender and ethnicity and the types of interventions encountered, as well as the number of interventions encountered. The amount of time spent in clinical interventions also differed significantly by gender. Relative to the outcomes measured, female gender was the only background variable significantly associated with improved housing status, and female gender and previous MHMRA involvement were statistically associated with improvement in GAF score. The total time in other (not clinical or social) interventions and the total number of encounters with other interventions were also significantly associated with improvement in housing outcome. The analysis of previous services and levels of program involvement revealed significant associations between time spent in social and clinical interventions and previous hospitalizations and previous MHMRA involvement.

Major limitations of this study include the small sample size, which may have resulted in very little power to detect differences, and the lack of generalizability of findings due to the site locations used in the study. Despite these limitations, the study makes an important contribution to the literature by documenting the levels of program involvement and the social and demographic factors necessary to produce outcomes of improved housing status and psychiatric functioning status.

Relevance: 100.00%

Abstract:

Rainfall controls fire in tropical savanna ecosystems by affecting both the amount and flammability of plant biomass, and consequently, predicted changes in tropical precipitation over the next century are likely to have contrasting effects on the fire regimes of wet and dry savannas. We reconstructed the long-term dynamics of biomass burning in equatorial East Africa, using fossil charcoal particles from two well-dated lake-sediment records in western Uganda and central Kenya. We compared these high-resolution (5 years/sample) time series of biomass burning, spanning the last 3800 and 1200 years, with independent data on past hydroclimatic variability and vegetation dynamics. In western Uganda, a rapid (<100 years) and permanent increase in burning occurred around 2170 years ago, when climatic drying replaced semideciduous forest by wooded grassland. At the century time scale, biomass burning was inversely related to moisture balance for much of the next two millennia until ca. 1750 AD, when burning increased strongly despite regional climate becoming wetter. A sustained decrease in burning since the mid-20th century reflects the intensified modern-day landscape conversion into cropland and plantations. In contrast, in semiarid central Kenya, biomass burning peaked at intermediate moisture-balance levels, whereas it was lower both during the wettest and driest multidecadal periods of the last 1200 years. Here, burning has steadily increased since the mid-20th century, presumably due to more frequent deliberate ignitions for bush clearing and cattle ranching. Both the observed historical trends and regional contrasts in biomass burning are consistent with spatial variability in fire regimes across the African savanna biome today. They demonstrate the strong dependence of East African fire regimes on both climatic moisture balance and vegetation, and the extent to which this dependence is now being overridden by anthropogenic activity.