941 results for STRATA
Abstract:
The impact of cancer on the population of Salvador, Bahia, Brazil was studied using mortality data available from the Brazilian National Bureau of Vital Statistics. Average annual site-, age-, and gender-specific and adjusted cancer mortality rates were determined for the years 1977-83 and contrasted with United States cancer mortality rates for 1977. The accuracy of the cancer mortality rates generated by this research was assessed by comparing the underlying causes of death as coded on death certificates with pathology reports and hospital diagnoses for a sample of 966 deaths occurring in Salvador during 1983. To further explore the information available on the death certificate, a population-based decedent case-control study was used to determine the relationship between type of occupation (a proxy for exposure) and mortality from cancer sites known to be occupationally related. The rates in Salvador for cancer of the stomach, oral cavity, and biliary passages are, on average, twofold higher than the U.S. rates. The death certificate was found to be accurate for 65 percent of the 485 cancer deaths studied. Thirty-five histologically confirmed cancer deaths were found in a random sample of 481 deaths from other causes, suggesting that approximately 700 additional cancer deaths may be hidden among the remaining 10,073 death certificates stating a cause other than cancer. In addition, despite the known limitations of decedent case-control studies, cancers of the oral cavity (OR = 2.44, CI = 1.17-5.09), stomach (OR = 2.31, CI = 1.18-4.52), liver (OR = 4.06, CI = 1.27-12.99), bladder (OR = 6.77, CI = 1.5-30.67), and lymphoma (OR = 2.55, CI = 1.04-6.25) had elevated point estimates in different age strata, indicating an association between these cancers and occupations that entailed exposure to petroleum and its derivatives.
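Since the occupational findings above are reported as odds ratios with 95% confidence intervals from a decedent case-control design, a minimal sketch of how such an estimate is computed from a 2x2 exposure table (using a Woolf logit-based interval) may help; the counts below are hypothetical and chosen only so the output lands near the stomach-cancer estimate, not taken from the study.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (logit) 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts for illustration only (not the study's data):
# 18 stomach-cancer deaths with petroleum-related occupations, 22 without,
# 45 exposed and 127 unexposed among the non-cancer decedent controls.
print(odds_ratio_ci(18, 22, 45, 127))  # -> roughly (2.31, 1.14, 4.70)
```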
Abstract:
The purpose of this study was to design, implement, and evaluate the effectiveness of a date rape prevention program among new students at Rice University. Six hundred fifteen new students were randomly assigned to one of eight residential colleges, or dormitories. The distribution of students to the dormitories followed a stratified random sampling procedure: the study population was divided into strata based on ethnicity, gender, geographical region, and academic major, and approximately 75 students were randomly assigned to each of the eight dormitories. After this procedure was completed, each of the colleges was randomly allocated to either the intervention or the control group. A randomized pretest and posttest control group design was used to assess changes in attitudes, self-efficacy, and behavior with regard to date rape. All participants were given an anonymous pretest and posttest measuring attitudes, self-efficacy, and behavior immediately prior to and following the intervention. The intervention group attended the play Scruples, designed to promote date rape prevention, after which they were immediately posttested. After this initial posttest the intervention group also participated in an interactive group role-playing activity led by trained peer instructors. The control group was pretested, subjected to the placebo intervention of a multiculturalism play, and posttested immediately afterwards. Later in the week this group saw the Scruples play only. Both control and intervention groups were sent a two-month follow-up survey questionnaire to measure any changes in attitudes, self-efficacy, and behavior over time. As hypothesized, students who saw the play Scruples showed a change in attitudes at the immediate posttest but no difference in self-efficacy or behavior. The two-month follow-up survey showed no change in attitudes, self-efficacy, or behavior. There was a difference at pretest between males' and females' attitudes, with males showing significantly more rape-tolerant attitudes than females. These findings provide a better understanding of the attitudes that perpetuate date rape and will inform strategies for prevention programs.
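The allocation described above is a stratified random assignment: students are grouped into strata defined by ethnicity, gender, region, and major, and each stratum is spread at random across the eight dormitories. A minimal sketch of such a procedure, with hypothetical record fields, is shown below.

```python
import random
from collections import defaultdict

def stratified_assign(students, n_groups=8, seed=42):
    """Assign students to n_groups dormitories so that each stratum
    (a tuple of ethnicity, gender, region, major) is spread roughly
    evenly across the groups."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for s in students:
        key = (s["ethnicity"], s["gender"], s["region"], s["major"])
        strata[key].append(s)
    assignment = {}
    for members in strata.values():
        rng.shuffle(members)
        for i, s in enumerate(members):
            assignment[s["id"]] = i % n_groups  # round-robin over the shuffled stratum
    return assignment

# Tiny synthetic roster (hypothetical fields; the real roster had 615 students).
students = [
    {"id": i, "ethnicity": i % 3, "gender": i % 2, "region": i % 4, "major": i % 5}
    for i in range(24)
]
dorms = stratified_assign(students)
# Dormitories 0-3 could then be randomized to intervention and 4-7 to control,
# mirroring the design described above.
print(dorms)
```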
Abstract:
The proportional distribution of independent malignant tumors in the contralateral breast following treatment for breast cancer was investigated to assess the influence of scattered radiation as a cause of these tumors. In a population of 172 patients, the proportion of contralateral tumors in each quadrant and the center (the nipple-areolar complex) was compared with the expected, or natural, distribution found in the general population in the absence of radiation. The observed/expected ratio for contralateral tumors was 1.43 for the upper-inner quadrant; 0.97, lower-inner quadrant; 1.51, center; 0.76, upper-outer quadrant; and 0.64, lower-outer quadrant. In each quadrant except the lower-inner, the observed/expected ratio differed from 1.00 with statistical significance at the 5% level (one-tailed). The same analysis, stratified by age and menopausal status, showed a similar shift of tumors, with more than expected in the inner quadrants and center and fewer than expected in the outer quadrants, although the results did not reach statistical significance at the 5% level for all strata. For each patient, the mean absorbed radiation dose for each quadrant and the center of the breast was estimated from measurements in a tissue-equivalent phantom. Among patients the doses ranged from 0.5 to 8 Gy; within individuals, doses to the inner quadrants were typically about three times higher than doses to the outer quadrants. The results suggest that radiation may be a risk factor for contralateral breast tumors and that this risk warrants further investigation.
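The quadrant analysis above compares observed counts of contralateral tumors with counts expected under the natural distribution. The abstract does not state the exact test used; the sketch below shows one common way to compute an observed/expected ratio with a one-tailed exact Poisson p-value, using hypothetical counts chosen only so the ratio resembles the upper-inner-quadrant value.

```python
import math

def oe_ratio_poisson_p(observed, expected):
    """O/E ratio and a one-tailed exact Poisson p-value:
    upper tail if observed >= expected, lower tail otherwise."""
    ratio = observed / expected
    pmf = lambda k: math.exp(-expected) * expected**k / math.factorial(k)
    if observed >= expected:
        p = 1 - sum(pmf(k) for k in range(observed))   # P(X >= observed)
    else:
        p = sum(pmf(k) for k in range(observed + 1))   # P(X <= observed)
    return ratio, p

# Hypothetical illustration: 23 upper-inner-quadrant tumors observed where
# 16.1 would be expected from the natural quadrant distribution -> O/E ~ 1.43.
print(oe_ratio_poisson_p(23, 16.1))
```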
Abstract:
Sanidine separates from pumice of the early Miocene Peach Springs Tuff are concordantly dated at 18.5 ± 0.2 Ma by two isotopic techniques. The Peach Springs Tuff is the only known unit that can be correlated between isolated outcrops of Miocene strata from the central Mojave Desert of southeastern California to the western Colorado Plateau in Arizona, across five structural provinces, a distance of 350 km. Thus the age of the Peach Springs Tuff is important to structural and paleogeographic reconstructions of a large region. Biotite and sanidine separates from bulk samples of the Peach Springs Tuff from zones of welding and vapor-phase alteration have not produced consistent ages by the K-Ar method. Published ages of mineral separates from 17 localities range from 16.2 to 20.5 Ma. Discordant 40Ar/39Ar incremental release spectra were obtained for one biotite and two of the sanidine separates. Ages corresponding to the last gas increments are as old as 27 Ma. The 40Ar/39Ar incremental release determinations on sanidine separated from blocks of Peach Springs Tuff pumice yield ages of 18.3 ± 0.3 and 18.6 ± 0.4 Ma. Laser fusion measurements yield a mean age of 18.51 ± 0.10 Ma. The results suggest that sanidine and biotite K-Ar ages older than about 18.5 Ma are due to inherited Ar from pre-Tertiary contaminants, which were likely incorporated into the tuff during deposition. Sanidine K-Ar ages younger than 18 Ma probably indicate incomplete extraction of radiogenic 40Ar, whereas laser fusion dates of biotite and hornblende younger than 18 Ma are likely due to postdepositional alteration. Laser fusion ages as high as 19.01 Ma on biotite grains from pumice suggest that minerals from pre-Tertiary country rocks were also incorporated in the magma chamber.
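A pooled laser-fusion age such as 18.51 ± 0.10 Ma is conventionally an inverse-variance weighted mean of the individual single-crystal determinations. A minimal sketch of that calculation, with made-up ages rather than the actual Peach Springs Tuff measurements, is below.

```python
import math

def weighted_mean_age(ages, sigmas):
    """Inverse-variance weighted mean age and its 1-sigma uncertainty."""
    weights = [1.0 / s**2 for s in sigmas]
    mean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
    sigma = math.sqrt(1.0 / sum(weights))
    return mean, sigma

# Hypothetical single-crystal sanidine ages (Ma) and 1-sigma errors,
# not the actual laser-fusion data:
ages   = [18.42, 18.55, 18.49, 18.60, 18.47]
sigmas = [0.20, 0.25, 0.18, 0.22, 0.21]
print(weighted_mean_age(ages, sigmas))  # -> roughly (18.50, 0.09)
```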
Abstract:
Englacial horizons deeper than 100 m are absent within 100 MHz ground-penetrating radar (GPR) surface profiles we recorded on Clark and Commonwealth Glaciers in the Antarctic Dry Valleys region. Both glaciers show continuous bottom horizons to 280 m, with bottom signal-to-noise ratios near 30 dB. Density horizons should fade below 50 m depth because impermeable ice occurs by 36 m. Folding within Commonwealth Glacier could preclude radar strata beneath about 80 m depth, but there is no significant folding within Clark Glacier. Strong sulfate concentrations and contrasts exist in our shallow ice core. However, it appears that high background concentration levels, and possibly decreased concentration contrasts with depth, placed the corresponding reflection coefficients at the limit of, or below, our system sensitivity by about 77 m depth. Further verification of this conclusion awaits processing of our deep-core chemistry profiles.
Abstract:
Unroofing of the Black Mountains, Death Valley, California, has resulted in the exposure of 1.7 Ga crystalline basement, late Precambrian amphibolite facies metasedimentary rocks, and a Tertiary magmatic complex. The Ar-40/Ar-39 cooling ages, obtained from samples collected across the entire length of the range (>55 km), combined with geobarometric results from synextensional intrusions, provide time-depth constraints on the Miocene intrusive history and extensional unroofing of the Black Mountains. Data from the southeastern Black Mountains and adjacent Greenwater Range suggest unroofing from shallow depths between 9 and 10 Ma. To the northwest, in the crystalline core of the range, biotite plateau ages from approximately 13 to 6.8 Ma from rocks making up the Death Valley turtlebacks indicate a midcrustal residence (with temperatures >300°C) prior to extensional unroofing. Biotite Ar-40/Ar-39 ages from both Precambrian basement and Tertiary plutons reveal a diachronous cooling pattern of decreasing ages toward the northwest, subparallel to the regional extension direction. Diachronous cooling was accompanied by dike intrusion, which also decreases in age toward the northwest. The cooling age pattern and geobarometric constraints in crystalline rocks of the Black Mountains suggest denudation of 10-15 km along a northwest-directed detachment system, consistent with regional reconstructions of Tertiary extension and with unroofing of a northwest-deepening crustal section. Mica cooling ages that deviate from the northwest-younging trend are consistent with northwestward transport of rocks initially at shallower crustal levels onto deeper levels along splays of the detachment. The well-known Amargosa chaos and perhaps the Badwater turtleback are examples of this "splaying" process. Considering the current distance of the structurally deepest samples from moderately to steeply east-tilted Tertiary strata in the southeastern Black Mountains, these data indicate an average initial dip of the detachment system on the order of 20°, similar to that determined for detachment faults in west-central Arizona and southeastern California. Beginning with an initially listric geometry, a pattern of footwall unroofing accompanied by dike intrusion progressed northwestward. This pattern may be explained by a model in which migration of footwall flexures occurs below a scoop-shaped hanging wall block. One consequence of this model is that gently dipping ductile fabrics developed in the middle crust steepen in the upper crust during unloading. This process reconciles the low initial dips obtained here with mapping that suggests transport of the upper plate on moderately to steeply dipping surfaces in the middle and upper crust.
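The ~20° average initial dip quoted above follows from simple trigonometry: the structural depth removed divided by the horizontal distance to the tilted Tertiary strata gives the tangent of the mean dip. The sketch below illustrates the arithmetic; the 10-15 km denudation comes from the abstract, while the ~35 km transport distance is an assumed illustrative value, not a figure stated in the text.

```python
import math

def mean_initial_dip(denudation_km, horizontal_distance_km):
    """Average dip of a detachment that removes `denudation_km` of crust
    over a horizontal transport distance `horizontal_distance_km`."""
    return math.degrees(math.atan(denudation_km / horizontal_distance_km))

# 10-15 km of denudation over an assumed ~35 km distance to the tilted strata:
for depth in (10, 12.5, 15):
    print(depth, "km ->", round(mean_initial_dip(depth, 35), 1), "degrees")
# -> roughly 16-23 degrees, i.e. on the order of 20 degrees
```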
Abstract:
In the Mt. Olympos region of northeastern Greece, continental margin strata and basement rocks were subducted and metamorphosed under blueschist facies conditions, and thrust over carbonate platform strata during Alpine orogenesis. Subsequent exposure of the subducted basement rocks by normal faulting has allowed an integrated study of the timing of metamorphism, its relationship to deformation, and the thermal history of the subducted terrane. Alpine low-grade metamorphic assemblages occur at four structural levels. Three thrust sheets composed of Paleozoic granitic basement and Mesozoic metasedimentary cover were thrust over Mesozoic carbonate rocks and Eocene flysch; thrusting and metamorphism occurred first in the highest thrust sheets and progressed downward as units were imbricated from NE to SW. 40Ar/39Ar spectra from hornblende, white mica, and biotite samples indicate that the upper two units preserve evidence of four distinct thermal events: (1) 293–302 Ma crystallization of granites, with cooling from >550°C to <325°C by 284 Ma; (2) 98–100 Ma greenschist to blueschist-greenschist transition facies metamorphism (T∼350–500°C) and imbrication of continental thrust sheets; (3) 53–61 Ma blueschist facies metamorphism and deformation of the basement and continental margin units at T<350–400°C; (4) 36–40 Ma thrusting of blueschists over the carbonate platform, and metamorphism at T∼200–350°C. Only the Eocene and younger events affected the lower two structural packages. A fifth event, indicated by diffusive loss profiles in microcline spectra, reflects the beginning of uplift and cooling to T<100–150°C at 16–23 Ma, associated with normal faulting which continued until Quaternary time. Incomplete resetting of mica ages in all units constrains the temperature of metamorphism during continental subduction to T≤350°C, the closure temperature for Ar in muscovite. The diffusive loss profiles in micas and K-feldspars enable us to “see through” the younger events to older events in the high-T parts of the release spectra. Micas grown during earlier metamorphic events lost relatively small amounts of Ar during subsequent high pressure-low temperature metamorphism. Release spectra from phengites grown during Eocene metamorphism and deformation record the ages of the Ar-loss events. Alpine deformation in northern Greece occurred over a long time span (∼90 Ma), and involved subduction and episodic imbrication of continental basement before, during, and after the collision of the Apulian and Eurasian plates. Syn-subduction uplift and cooling probably combined with intermittently higher cooling rates during extensional events to preserve the blueschist facies mineral assemblages as they were exhumed from depths of >20 km. Extension in the Olympos region was synchronous with extension in the Mesohellenic trough and the Aegean back-arc, and concurrent with westward-progressing shortening in the external Hellenides.
Abstract:
OBJECTIVE We aimed to create an index to stratify cryptogenic stroke (CS) patients with patent foramen ovale (PFO) by their likelihood that the stroke was related to their PFO. METHODS Using data from 12 component studies, we used generalized linear mixed models to predict the presence of PFO among patients with CS, and derive a simple index to stratify patients with CS. We estimated the stratum-specific PFO-attributable fraction and stratum-specific stroke/TIA recurrence rates. RESULTS Variables associated with a PFO in CS patients included younger age, the presence of a cortical stroke on neuroimaging, and the absence of these factors: diabetes, hypertension, smoking, and prior stroke or TIA. The 10-point Risk of Paradoxical Embolism score is calculated from these variables so that the youngest patients with superficial strokes and without vascular risk factors have the highest score. PFO prevalence increased from 23% (95% confidence interval [CI]: 19%-26%) in those with 0 to 3 points to 73% (95% CI: 66%-79%) in those with 9 or 10 points, corresponding to attributable fraction estimates of approximately 0% to 90%. Kaplan-Meier estimated stroke/TIA 2-year recurrence rates decreased from 20% (95% CI: 12%-28%) in the lowest Risk of Paradoxical Embolism score stratum to 2% (95% CI: 0%-4%) in the highest. CONCLUSION Clinical characteristics identify CS patients who vary markedly in PFO prevalence, reflecting clinically important variation in the probability that a discovered PFO is likely to be stroke-related vs incidental. Patients in strata more likely to have stroke-related PFOs have lower recurrence risk.
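The stratum-specific attributable fractions quoted above (roughly 0% to 90%) can be recovered from the observed PFO prevalence in each RoPE-score stratum relative to the background PFO prevalence in the general population. The sketch below uses one standard formulation and assumes a background prevalence of about 25%; both the formula choice and that value are assumptions here, not details given in the abstract.

```python
def pfo_attributable_fraction(p_observed, p_background=0.25):
    """Approximate fraction of discovered PFOs in a cryptogenic-stroke stratum
    that are stroke-related rather than incidental, given the observed PFO
    prevalence in the stratum and an assumed background population prevalence."""
    return (p_observed - p_background) / (p_observed * (1 - p_background))

# RoPE 0-3 stratum (23% PFO prevalence) vs 9-10 stratum (73%),
# assuming ~25% background prevalence:
print(pfo_attributable_fraction(0.23))  # ~ -0.12 -> read as approximately 0%
print(pfo_attributable_fraction(0.73))  # ~ 0.88  -> approximately 90%
```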
Abstract:
Forest management not only affects biodiversity but might also alter ecosystem processes mediated by the organisms, i.e. herbivory, the removal of plant biomass by plant-eating insects and other arthropod groups. Aiming to reveal general relationships between forest management and herbivory, we investigated aboveground arthropod herbivory in 105 plots dominated by European beech in three different regions of Germany, in the sun-exposed canopy of mature beech trees and on beech saplings in the understorey. We separately assessed damage by different guilds of herbivores, i.e. chewing, sucking and scraping herbivores, gall-forming insects and mites, and leaf-mining insects. We asked whether herbivory differs among forest management regimes (unmanaged, uneven-aged managed, even-aged managed) and among age-classes within even-aged forests. We further tested for consistency of relationships between regions, strata and herbivore guilds. On average, almost 80% of beech leaves showed herbivory damage, and about 6% of leaf area was consumed. Chewing damage was most common, whereas leaf sucking and scraping damage were very rare. Damage was generally greater in the canopy than in the understorey, in particular for chewing and scraping damage and the occurrence of mines. There was little difference in herbivory among differently managed forests, and the effects of management on damage differed among regions, strata and damage types. Covariates such as wood volume, tree density and plant diversity weakly influenced herbivory, and effects differed between herbivory types. We conclude that despite the relatively low number of species attacking beech, arthropod herbivory on beech is generally high. We further conclude that responses of herbivory to forest management are multifaceted, and that environmental factors such as forest structure variables, affecting in particular microclimatic conditions, are more likely to explain the variability in herbivory among beech forest plots.
Abstract:
The protection and sustainable management of forest carbon stocks, particularly in the tropics, is a key factor in the mitigation of global change effects. However, our knowledge of how land use and elevation affect carbon stocks in tropical ecosystems is very limited. We compared aboveground biomass of trees, shrubs and herbs for eleven natural and human-influenced habitat types occurring over a wide elevation gradient (866–4550 m) at the world's highest solitary mountain, Mount Kilimanjaro. Thanks to the enormous elevation gradient, we covered important natural habitat types, e.g., savanna woodlands, montane rainforest and afro-alpine vegetation, as well as important land-use types such as maize fields, grasslands, traditional home gardens, coffee plantations and selectively logged forest. To assess tree and shrub biomass with pantropical allometric equations, we measured tree height, diameter at breast height and wood density; to assess herbaceous biomass, we sampled destructively. Among natural habitats, tree biomass was highest at intermediate elevation in the montane zone (340 Mg ha−1), shrub biomass declined linearly from 7 Mg ha−1 at 900 m to zero above 4000 m, and, inverse to tree biomass, herbaceous biomass was lower at mid-elevations (1 Mg ha−1) than in savannas (900 m, 3 Mg ha−1) or alpine vegetation (above 4000 m, 6 Mg ha−1). While the various land-use types dramatically decreased woody biomass at all elevations, though to varying degrees, herbaceous biomass typically increased. Our study highlights tropical montane forest biomass as an important aboveground carbon stock and quantifies the extent of the strong aboveground biomass reductions by the major land-use types common to East Africa. Further, it shows that elevation and land use affect the different vegetation strata, and thus the matrix for other organisms, in different ways.
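Tree aboveground biomass from diameter, height and wood density is commonly estimated with the pantropical allometric model of Chave et al. (2014), AGB = 0.0673 (rho D^2 H)^0.976, with D in cm, H in m and rho in g/cm3. The abstract names only "pantropical allometric equations", so treating this particular model as the one applied is an assumption; the sketch below uses a hypothetical tree.

```python
def tree_agb_kg(dbh_cm, height_m, wood_density_g_cm3):
    """Aboveground biomass (kg) of a single tree from the pantropical
    allometric model of Chave et al. (2014):
    AGB = 0.0673 * (rho * D^2 * H)^0.976."""
    return 0.0673 * (wood_density_g_cm3 * dbh_cm**2 * height_m) ** 0.976

# Hypothetical montane-forest tree: 45 cm DBH, 28 m tall, density 0.55 g/cm3.
print(round(tree_agb_kg(45, 28, 0.55)))  # -> about 1640 kg (~1.6 Mg) per tree
```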
Abstract:
Environmental aspects are increasingly being integrated into Negev Bedouin studies by both NGO activists and scholars. We will present these recent works and discuss new concepts and methodologies of environmental studies with potential relevance to the field of Negev Bedouin studies. We will then identify research areas where environmental and development approaches converge or diverge with mainstream social sciences in this specific field of research. While most of the Bedouin population in southern Israel lives in urban centers in the Northern Negev, a large proportion of Bedouin people live in unrecognized clusters of houses in remote areas. Extensive livestock rearing is an important source of livelihood at least for non-urbanized Bedouin, who form the lowest economic strata of the Israeli income spectrum. Numerous stressors affect this Bedouin community, which endures uncertain livelihood and access to land. Erratic precipitation from year to year and long-term changes in precipitation trends are a source of great uncertainty. With a significant price increase for feeding supplements to compensate for dry years, livestock rearing has become a harsher source of livelihood. Land scarcity for grazing adds to the difficulty of ensuring enough income for living. Studies in the last 15 years have described several livelihood strategies based on a livestock-rearing semi-nomadic economy in the Negev. A number of other analyses have shown how Bedouin herders and governmental agencies have found agreements to the advantage of both the agencies and the herders. New concepts such as transformability, resilience and adaptation strategies are important tools for analyzing the capacity of vulnerable communities to cope with ever-increasing livelihood uncertainty. Such research concepts can assist in better understanding how Bedouin herders in the Negev may adapt to climate and political risks.
Abstract:
While the system-stabilizing function of reciprocity is widely acknowledged, much less attention has been paid to the argument that reciprocity might initiate social cooperation in the first place. This paper tests Gouldner's early assumption that reciprocity may act as a 'starting mechanism' of social cooperation in consolidating societies. The empirical test scenario builds on unequal civic engagement between immigrants and nationals, as this engagement gap can be read as a lack of social cooperation in consolidating immigration societies. Empirical analyses using survey data on reciprocal norms and based on Bayesian hierarchical modelling lend support to Gouldner's thesis, thereby underlining the relevance of reciprocity in today's increasingly diverse societies: individual norms of altruistic reciprocity elevate immigrants' propensity to volunteer, thereby reducing the engagement gap between immigrants and natives in the area of informal volunteering. In other words, compliance with altruistic reciprocity may trigger cooperation in social strata where it is less likely to occur. The positive moderation of the informal engagement gap through altruistic reciprocity turns out to be most pronounced for immigrants who are least likely to engage in informal volunteering, that is, immigrants with low but also those with high levels of education.
Abstract:
Background. Cryptococcal meningitis is a leading cause of death in people living with human immunodeficiency virus (HIV)/acquired immune deficiency syndrome. The World Health Organization recommends pre-antiretroviral treatment (ART) cryptococcal antigen (CRAG) screening in persons with CD4 below 100 cells/µL. We assessed the prevalence and outcome of cryptococcal antigenemia in rural southern Tanzania. Methods. We conducted a retrospective study including all ART-naive adults with CD4 <150 cells/µL prospectively enrolled in the Kilombero and Ulanga Antiretroviral Cohort between 2008 and 2012. Cryptococcal antigen was assessed in cryopreserved pre-ART plasma. Cox regression estimated the composite outcome of death or loss to follow-up (LFU) by CRAG status and fluconazole use. Results. Of 750 ART-naive adults, 28 (3.7%) were CRAG-positive, corresponding to a prevalence of 4.4% (23 of 520) in CD4 <100 and 2.2% (5 of 230) in CD4 100-150 cells/µL. Within 1 year, 75% (21 of 28) of CRAG-positive and 42% (302 of 722) of CRAG-negative patients were dead or LFU (P < .001), with no differences across CD4 strata. Cryptococcal antigen positivity was an independent predictor of death or LFU after adjusting for relevant confounders (hazard ratio [HR], 2.50; 95% confidence interval [CI], 1.29-4.83; P = .006). Cryptococcal meningitis occurred in 39% (11 of 28) of CRAG-positive patients, with similar retention in care regardless of meningitis diagnosis (P = .8). Cryptococcal antigen titer >1:160 was associated with meningitis development (odds ratio, 4.83; 95% CI, 1.24-8.41; P = .008). Fluconazole receipt decreased death or LFU in CRAG-positive patients (HR, 0.18; 95% CI, .04-.78; P = .022). Conclusions. Cryptococcal antigenemia predicted mortality or LFU among ART-naive HIV-infected persons with CD4 <150 cells/µL, and fluconazole increased survival or retention in care, suggesting that targeted pre-ART CRAG screening may decrease early mortality or LFU. A CRAG screening threshold of CD4 <100 cells/µL missed 18% of CRAG-positive patients, suggesting guidelines should consider a higher threshold.
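The adjusted hazard ratio for death or loss to follow-up by CRAG status comes from a Cox proportional-hazards model. A minimal sketch of how such a model could be fitted with the lifelines library is below; the file and column names are hypothetical, the covariate set is only an approximation of "relevant confounders", and categorical covariates are assumed to be numerically encoded.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Hypothetical analysis dataset: one row per ART-naive patient with follow-up
# time (months), a combined death/LFU event indicator, baseline CRAG status,
# and baseline confounders (all columns already numeric).
df = pd.read_csv("crag_cohort.csv")  # hypothetical file name

cph = CoxPHFitter()
cph.fit(
    df[["time_months", "death_or_lfu", "crag_positive",
        "cd4_baseline", "age", "sex", "who_stage"]],
    duration_col="time_months",
    event_col="death_or_lfu",
)
cph.print_summary()  # the crag_positive row gives the adjusted HR with 95% CI
```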
Abstract:
BACKGROUND Estimates of the size of the undiagnosed HIV-infected population are important to understand the HIV epidemic and to plan interventions, including "test-and-treat" strategies. METHODS We developed a multi-state back-calculation model to estimate HIV incidence, time between infection and diagnosis, and the undiagnosed population by CD4 count strata, using surveillance data on new HIV and AIDS diagnoses. The HIV incidence curve was modelled using cubic splines. The model was tested on simulated data and applied to surveillance data on men who have sex with men in The Netherlands. RESULTS The number of HIV infections could be estimated accurately using simulated data, with most values within the 95% confidence intervals of model predictions. When applying the model to Dutch surveillance data, 15,400 (95% confidence interval [CI] = 15,000, 16,000) men who have sex with men were estimated to have been infected between 1980 and 2011. HIV incidence showed a bimodal distribution, with peaks around 1985 and 2005 and a decline in recent years. Mean time to diagnosis was 6.1 (95% CI = 5.8, 6.4) years between 1984 and 1995 and decreased to 2.6 (2.3, 3.0) years in 2011. By the end of 2011, 11,500 (11,000, 12,000) men who have sex with men in The Netherlands were estimated to be living with HIV, of whom 1,750 (1,450, 2,200) were still undiagnosed. Of the undiagnosed men who have sex with men, 29% (22, 37) were infected for less than 1 year, and 16% (13, 20) for more than 5 years. CONCLUSIONS This multi-state back-calculation model will be useful to estimate HIV incidence, time to diagnosis, and the undiagnosed HIV epidemic based on routine surveillance data.
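Back-calculation works by relating observed diagnoses to past incidence through the distribution of time from infection to diagnosis. The toy, discrete-time sketch below shows only that core convolution step, far simpler than the multi-state, CD4-stratified model described above; all numbers are made up.

```python
import numpy as np

def expected_diagnoses(incidence, delay_pmf):
    """Expected new diagnoses per year when `incidence[t]` people are infected
    in year t and the time from infection to diagnosis follows the discrete
    distribution `delay_pmf` (P(delay = 0, 1, 2, ... years))."""
    T = len(incidence)
    diagnosed = np.zeros(T)
    for t in range(T):
        for d, p in enumerate(delay_pmf):
            if t + d < T:
                diagnosed[t + d] += incidence[t] * p
    return diagnosed

# Toy example: a bimodal incidence curve and a short diagnosis-delay distribution.
incidence = np.array([100, 300, 500, 300, 150, 200, 400, 350, 250, 150], float)
delay_pmf = np.array([0.25, 0.25, 0.20, 0.15, 0.10, 0.05])
print(expected_diagnoses(incidence, delay_pmf))
# Back-calculation inverts this relationship: given observed diagnoses and an
# estimated delay distribution, it estimates the incidence curve (modelled in
# the study with cubic splines) that best reproduces them.
```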
Abstract:
Treatment allocation by epidermal growth factor receptor mutation status is a new standard in patients with metastatic non-small-cell lung cancer. Yet, relatively few modern chemotherapy trials have been conducted in patients characterized by epidermal growth factor receptor wild type. We describe the results of a multicenter phase II trial, testing in parallel 2 novel combination therapies, predefined molecular markers, and tumor rebiopsy at progression. Objective: The goal was to demonstrate that tailored therapy, according to tumor histology and epidermal growth factor receptor (EGFR) mutation status, and the introduction of novel drug combinations in the treatment of advanced non-small-cell lung cancer are promising for further investigation. Methods: We conducted a multicenter phase II trial with mandatory EGFR testing and 2 strata. Patients with EGFR wild type received 4 cycles of bevacizumab, pemetrexed, and cisplatin, followed by maintenance with bevacizumab and pemetrexed until progression. Patients with EGFR mutations received bevacizumab and erlotinib until progression. Patients had computed tomography scans every 6 weeks and repeat biopsy at progression. The primary end point was progression-free survival (PFS) ≥ 35% at 6 months in the EGFR wild-type stratum; 77 patients were required to reach a power of 90% with an alpha of 5%. Secondary end points were median PFS, overall survival, best overall response rate (ORR), and tolerability. Further biomarkers and biopsy at progression were also evaluated. Results: A total of 77 evaluable patients with EGFR wild type received an average of 9 cycles (range, 1-25). PFS at 6 months was 45.5%, median PFS was 6.9 months, overall survival was 12.1 months, and the ORR was 62%. Kirsten rat sarcoma oncogene mutations and circulating vascular endothelial growth factor negatively correlated with survival, but thymidylate synthase expression did not. A total of 20 patients with EGFR mutations received an average of 16.
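The statement that 77 patients give 90% power at an alpha of 5% for the 6-month PFS endpoint is consistent with a single-arm, single-stage binomial design. The sketch below shows one common way such a sample size could be checked (an A'Hern-type exact design); the null and alternative 6-month PFS rates used are hypothetical, since the protocol's exact design assumptions are not given in the abstract, so the output is not expected to reproduce 77 exactly.

```python
from scipy.stats import binom

def single_stage_design(p0, p1, alpha=0.05, power=0.90, max_n=200):
    """Smallest single-stage sample size n and success threshold r such that
    P(X >= r | n, p0) <= alpha (type I error) and
    P(X >= r | n, p1) >= power, with X ~ Binomial(n, p)."""
    for n in range(10, max_n + 1):
        for r in range(n + 1):
            if binom.sf(r - 1, n, p0) <= alpha and binom.sf(r - 1, n, p1) >= power:
                return n, r
    return None

# Hypothetical design parameters for illustration only: an uninteresting
# 6-month PFS rate p0 = 0.35 versus a promising rate p1 = 0.50.
print(single_stage_design(0.35, 0.50))
```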