Abstract:
Aims: Angiographic evidence of edge dissections has been associated with a risk of early stent thrombosis. Optical coherence tomography (OCT) is a high-resolution imaging technology that detects a greater number of edge dissections, particularly non-flow-limiting ones, than angiography. Their natural history and clinical implications remain unclear. The objectives of the present study were to assess the morphology, healing response, and clinical outcomes of OCT-detected edge dissections using serial OCT imaging at baseline and at one year following drug-eluting stent (DES) implantation. Methods and results: Edge dissections were defined as disruptions of the luminal surface in the 5 mm segments proximal and distal to the stent, and categorised as flaps, cavities, double-lumen dissections or fissures. Qualitative and quantitative OCT analyses were performed every 0.5 mm at baseline and at one year, and clinical outcomes were assessed. Sixty-three lesions (57 patients) were studied with OCT at baseline and at one-year follow-up. Twenty-two non-flow-limiting edge dissections in 21 lesions (20 patients) were identified by OCT; only two (9%) were angiographically visible. Flaps were found in 96% of cases. The median longitudinal dissection length was 2.9 mm (interquartile range [IQR] 1.6-4.2 mm), whereas the circumferential and axial extensions amounted to 1.2 mm (IQR: 0.9-1.7 mm) and 0.6 mm (IQR: 0.4-0.7 mm), respectively. Dissections extended into the media and adventitia in seven (33%) and four (20%) cases, respectively. Eighteen (82%) OCT-detected edge dissections were also evaluated with intravascular ultrasound, which identified nine (50%) of them. No stent thrombosis or target lesion revascularisation occurred up to one year. At follow-up, 20 (90%) edge dissections were completely healed on OCT. The two cases exhibiting persistent dissection had the longest flaps (2.81 mm and 2.42 mm) at baseline. Conclusions: OCT-detected edge dissections, which are angiographically silent in the majority of cases, are not associated with acute stent thrombosis or restenosis up to one-year follow-up.
Abstract:
A cohort study design was used to examine the relationship between maternal low birthweight and infant low birthweight among African American women delivering full-term infants. The cohort consisted of 3,157 mother-infant pairs drawn from the 1988 National Maternal and Infant Health Survey conducted by the National Center for Health Statistics. The objectives of the study were (1) to determine whether low-birthweight African American mothers delivering term infants experienced higher rates of infant low birthweight, and (2) to examine the role of selected contributory variables in the relationship between maternal low birthweight and infant low birthweight. Contributory risk factors examined included maternal marital status, maternal age, maternal education, maternal height, maternal prepregnant weight, birth order, history of a prior low birthweight delivery, timing of prenatal care, number of prenatal visits, gestational length, infant gender, and the behavioral factors of smoking, alcohol, and illicit drug use during pregnancy. Using logistic regression analysis, the risk of infant low birthweight among low-birthweight mothers remained elevated after controlling for less than a high school education, age under 20 years, prepregnant weight less than 100 lbs, history of a prior low birthweight delivery, birth order, smoking during pregnancy, and use of alcohol and illicit drugs during pregnancy, but the association was not statistically significant. The loss of statistical significance was attributed to a large reduction in the cases available for analysis after including illicit drug use in the model. This study demonstrated a consistent pattern of increased rates of infant low birthweight among low-birthweight mothers. The force of history remains; hence women with this trait should be carefully monitored and advised during pregnancy to decrease the risk of delivering a low birthweight infant and to interrupt the chain of events leading to future generations of low birthweight mothers.
Abstract:
Varved lake sediments are excellent natural archives, providing quantitative insights into climatic and environmental changes at very high resolution and chronological accuracy. However, owing to the multitude of responses within lake ecosystems, it is often difficult to understand how climate variability interacts with other environmental pressures such as eutrophication, and to attribute observed changes to specific causes. This is particularly challenging for the past 100 years, when multiple strong trends are superposed. Here we present a high-resolution multi-proxy record of sedimentary pigments and other biogeochemical data from the varved sediments of Lake Żabińskie (Masurian Lake District, north-eastern Poland, 54°N, 22°E, 120 m a.s.l.) spanning AD 1907 to 2008. Lake Żabińskie exhibits biogeochemical varves with highly organic late-summer and winter layers separated by white layers of endogenous calcite precipitated in early summer. The aim of our study is to investigate whether climate-driven changes and anthropogenic changes can be separated in a multi-proxy sediment data set, and to explore which sediment proxies are potentially suitable for long quantitative climate reconstructions. We also test whether complex analytical techniques (e.g. HPLC) can be substituted by rapid scanning techniques (visible reflectance spectroscopy, VIS-RS; 380–730 nm). We used principal component analysis and cluster analysis to show that the recent eutrophication of Lake Żabińskie can be discriminated from climate-driven changes for the period AD 1907–2008. The eutrophication signal (PC1 = 46.4%; TOC, TN, TS, Phe-b, and high TC/CD ratios, i.e. total carotenoids/chlorophyll-a derivatives) is mainly expressed as increasing aquatic primary production, increasing hypolimnetic anoxia, and a change in the algal community from green algae to blue-green algae. The proxies diagnostic of eutrophication show a smooth positive trend between 1907 and ca 1980, followed by a very rapid increase from ca 1980 ± 2 onwards. We demonstrate that PC2 (24.4%, Chl-a-related pigments) is not affected by the eutrophication signal, but instead is sensitive to spring (MAM) temperature (r = 0.63, p_corr < 0.05, RMSEP = 0.56 °C; 5-yr filtered). Limnological monitoring data (2011–2013) support this finding. We also demonstrate that scanning visible reflectance spectroscopy (VIS-RS) data can be calibrated against HPLC-measured chloropigment data and used to infer concentrations of sedimentary Chl-a derivatives (pheophytin a + pyropheophytin a). This offers the possibility of very high-resolution, (multi)millennial-long paleoenvironmental reconstructions.
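To make the calibration step concrete, the following is a minimal Python sketch on synthetic data: HPLC-measured chloropigment concentrations are regressed on a hypothetical reflectance-derived index (here called vis_rs_index), and prediction skill is summarised as a leave-one-out RMSEP. It illustrates the general idea only and is not the study's actual calibration procedure.

# Minimal sketch (hypothetical data): calibrating scanning VIS-RS output against
# HPLC-measured chloropigment concentrations with a simple linear model and
# leave-one-out cross-validation (prediction error reported as RMSEP).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical predictor: a spectral index derived from each sample's 380-730 nm
# reflectance scan (e.g. an absorption-band-depth proxy for chlorophyll-a derivatives).
vis_rs_index = rng.uniform(0.0, 1.0, size=60).reshape(-1, 1)

# Hypothetical target: HPLC-measured pheophytin a + pyropheophytin a (ug/g), assumed
# to respond roughly linearly to the spectral index plus noise.
hplc_chl_a_deriv = 5.0 + 40.0 * vis_rs_index.ravel() + rng.normal(0.0, 2.0, size=60)

model = LinearRegression()
predicted = cross_val_predict(model, vis_rs_index, hplc_chl_a_deriv, cv=LeaveOneOut())

rmsep = np.sqrt(np.mean((predicted - hplc_chl_a_deriv) ** 2))
r = np.corrcoef(predicted, hplc_chl_a_deriv)[0, 1]
print(f"leave-one-out RMSEP = {rmsep:.2f} ug/g, r = {r:.2f}")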
Abstract:
BACKGROUND No data exist on the patterns of biochemical recurrence (BCR) and their effect on survival in patients with high-risk prostate cancer (PCa) treated with surgery. The aim of our investigation was to evaluate the natural history of PCa in patients treated with radical prostatectomy (RP) alone. MATERIALS AND METHODS Overall, 2,065 patients with high-risk PCa treated with RP at 7 tertiary referral centers between 1991 and 2011 were identified. First, we calculated the probability of experiencing BCR after surgery. Specifically, we relied on conditional survival estimates for BCR after RP. Competing-risks regression analyses were then used to evaluate the effect of time to BCR on the risk of cancer-specific mortality (CSM). RESULTS Median follow-up was 70 months. Overall, the 5-year BCR-free survival rate was 55.2%. Given BCR-free survivorship at 1, 2, 3, 4, and 5 years, the BCR-free survival rates improved by +7.6%, +4.1%, +4.8%, +3.2%, and +3.7%, respectively. Overall, the 10-year CSM rate was 14.8%. When patients were stratified according to time to BCR, those experiencing BCR within 36 months of surgery had higher 10-year CSM rates than those experiencing late BCR (19.1% vs. 4.4%; P < 0.001). At multivariate analyses, time to BCR represented an independent predictor of CSM (P < 0.001). CONCLUSIONS Increasing time from surgery is associated with a reduction in the risk of subsequent BCR. Additionally, time to BCR is a predictor of CSM in these patients. These results might help provide clinicians with better follow-up strategies and more aggressive treatments for early BCR.
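The conditional survival estimates mentioned above can be illustrated with a small Python sketch on synthetic data: from a Kaplan-Meier curve, the probability of remaining BCR-free to year t, given BCR-free survivorship to year s, is S(t)/S(s). All durations, censoring times and rates below are hypothetical.

# Minimal sketch (synthetic data): conditional BCR-free survival from a Kaplan-Meier
# estimate, computed as S(t) / S(s) for t >= s.
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)
n = 500
time_to_bcr = rng.exponential(scale=8.0, size=n)       # hypothetical years to BCR
censor_years = rng.uniform(1.0, 12.0, size=n)          # hypothetical censoring times
observed = time_to_bcr <= censor_years                  # True if BCR was observed
durations = np.minimum(time_to_bcr, censor_years)

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=observed)

def conditional_survival(kmf, s, t):
    """P(BCR-free at year t | BCR-free at year s), with t >= s."""
    return kmf.survival_function_at_times(t).iloc[0] / kmf.survival_function_at_times(s).iloc[0]

for s in range(0, 6):
    print(f"5-year BCR-free survival given {s} BCR-free year(s): "
          f"{conditional_survival(kmf, s, s + 5):.3f}")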
Abstract:
The investigation of the consequences of new technologies has a long-standing tradition within economics. In particular, labor economists have asked how the introduction of new technologies, e.g. personal computers (PCs), has shaped labor markets. Previous research has concentrated on the question of whether on-the-job use of PCs creates a wage bonus for employees. In this paper, we investigate whether the use of PCs increases employees' probability of an upward shift in their employment status and whether it reduces the risk of involuntary labor market exits. We do so by applying event history analysis to the Swiss Labor Market Survey, a random sample of 3,028 respondents, and by analyzing a panel sub-sample of 650 respondents collected recently in Switzerland. Our results show that on-the-job use of PCs was beneficial for employees in the past, increasing their probability of an upward shift by approximately 50%. The analysis also suggests that PC use reduces the risk and duration of unemployment; however, these latter results fail to reach statistical significance.
Abstract:
The study of mass movements in lake sediments provides insights into past natural hazards at historic and prehistoric timescales. Sediments from the deep basin of Lake Geneva reveal a succession of six large-scale (volumes of 22 × 10⁶ to 250 × 10⁶ m³) mass-transport deposits, associated with five mass-movement events within 2,600 years (4000 cal BP to AD 563). The mass-transport deposits result from: (i) lateral slope failures (mass-transport deposit B at 3895 ± 225 cal BP and mass-transport deposits A and C at 3683 ± 128 cal BP); and (ii) Rhône delta collapses (mass-transport deposits D to G, dated at 2650 ± 150 cal BP, 2185 ± 85 cal BP, 1920 ± 120 cal BP and AD 563, respectively). Mass-transport deposits A and C were most probably triggered by an earthquake, whereas the Rhône delta collapses were likely due to sediment overload with a rockfall as the external trigger (mass-transport deposit G, the Tauredunum event of AD 563 known from historical records), an earthquake (mass-transport deposit E) or unknown external triggers (mass-transport deposits D and F). Independent of their origin and trigger mechanisms, numerical simulations show that all of these recorded mass-transport deposits are large enough to have generated at least metre-scale tsunamis during mass-movement initiation. Since the Tauredunum event of AD 563, two small-scale (volumes of 1 to 2 × 10⁶ m³) mass-transport deposits (H and I) are present in the seismic record, both of which are associated with small lateral slope failures. Mass-transport deposits H and I might be related to the earthquakes of Lausanne/Geneva (possibly AD 1322) and Aigle (AD 1584), respectively. The sedimentary record of the deep basin of Lake Geneva, in combination with the historical record, shows that during the past 3,695 years at least six tsunamis were generated by mass movements, indicating that the tsunami hazard in the Lake Geneva region should not be neglected, even though such events are infrequent, with a recurrence rate of about 0.0016 yr⁻¹ (roughly one event every 600 years).
Abstract:
BACKGROUND/AIMS Controversies still exist regarding the evaluation, at the end of growth, of growth hormone deficiency (GHD) diagnosed in childhood. The aim of this study was to describe the natural history of GHD in a pediatric cohort. METHODS This is a retrospective study of a cohort of pediatric patients with GHD. Cases of acquired GHD were excluded. Univariate logistic regression was used to identify predictors of GHD persisting into adulthood. RESULTS Among 63 identified patients, 47 (75%) had partial GHD at diagnosis, while 16 (25%) had complete GHD, including 5 with multiple pituitary hormone deficiencies. At final height, 50 patients underwent repeat stimulation testing; 28 (56%) had recovered and 22 (44%) remained growth hormone (GH) deficient. Predictors of persistent GHD were: complete GHD at diagnosis (OR 10.1, 95% CI 2.4-42.1), pituitary stalk defect or ectopic pituitary gland on magnetic resonance imaging (OR 6.5, 95% CI 1.1-37.1), greater height gain during GH treatment (OR 1.8, 95% CI 1.0-3.3), and IGF-1 level < -2 standard deviation scores (SDS) following treatment cessation (OR 19.3, 95% CI 3.6-103.1). In the multivariate analysis, only IGF-1 level < -2 SDS (OR 13.3, 95% CI 2.3-77.3) and complete GHD (OR 6.3, 95% CI 1.2-32.8) remained associated with the outcome. CONCLUSION At final height, 56% of adolescents with GHD had recovered. Complete GHD at diagnosis, low IGF-1 levels following retesting, and pituitary malformation were strong predictors of persistence of GHD.
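As an illustration of how such odds ratios are obtained, here is a minimal Python sketch on synthetic data: a univariate logistic regression whose exponentiated coefficient and confidence limits give the OR and 95% CI. The variable names (e.g. complete_ghd, persistent_ghd) and effect sizes are assumptions for the example, not the study's data.

# Minimal sketch (synthetic data): univariate logistic regression for persistent GHD,
# reporting an odds ratio and 95% CI by exponentiating the fitted coefficient.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 50
complete_ghd = rng.binomial(1, 0.3, size=n)                    # hypothetical predictor
logit_p = -1.0 + 2.0 * complete_ghd                            # assumed true effect
persistent_ghd = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))   # hypothetical outcome

X = sm.add_constant(pd.DataFrame({"complete_ghd": complete_ghd}))
fit = sm.Logit(persistent_ghd, X).fit(disp=False)

or_point = np.exp(fit.params["complete_ghd"])
or_ci = np.exp(fit.conf_int().loc["complete_ghd"])
print(f"OR = {or_point:.1f} (95% CI {or_ci[0]:.1f}-{or_ci[1]:.1f})")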
Abstract:
PURPOSE The Geographic Atrophy Progression (GAP) study was designed to assess the rate of geographic atrophy (GA) progression and to identify prognostic factors by measuring the enlargement of atrophic lesions using fundus autofluorescence (FAF) and color fundus photography (CFP). DESIGN Prospective, multicenter, noninterventional natural history study. PARTICIPANTS A total of 603 participants were enrolled in the study; 413 had gradable lesion data from FAF or CFP, and 321 had gradable lesion data from both FAF and CFP. METHODS Atrophic lesion areas were measured by FAF and CFP to assess lesion progression over time. Lesion size assessments and best-corrected visual acuity (BCVA) were conducted at screening/baseline (day 0) and at 3 follow-up visits: month 6, month 12, and month 18 (or early exit). MAIN OUTCOME MEASURES The GA lesion progression rate in disease subgroups and the mean change in visual acuity from baseline. RESULTS Mean (standard error) lesion size changes from baseline, determined by FAF and CFP, respectively, were 0.88 (0.1) and 0.78 (0.1) mm² at 6 months, 1.85 (0.1) and 1.57 (0.1) mm² at 12 months, and 3.14 (0.4) and 3.17 (0.5) mm² at 18 months. The mean change in lesion size from baseline to month 12 was significantly greater in participants whose eyes had multifocal atrophic spots than in those with unifocal spots (P < 0.001), and in those with extrafoveal lesions than in those with foveal lesions (P = 0.001). The mean (standard deviation) decrease in visual acuity was 6.2 ± 15.6 letters for patients with image data available. Atrophic lesions with a diffuse (mean 0.95 mm²) or banded (mean 1.01 mm²) FAF pattern grew more rapidly by month 6 than those with the "none" (mean 0.13 mm²) and focal (mean 0.36 mm²) FAF patterns. CONCLUSIONS Although differences were observed between mean lesion size measurements obtained with FAF imaging and with CFP, the measurements were highly correlated with one another. Significant differences were found in lesion progression rates among participants stratified by hyperfluorescence pattern subtype. This large GA natural history study provides a strong foundation for future clinical trials.
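A small Python sketch, using hypothetical measurements, of the two kinds of quantities reported above: mean lesion enlargement from baseline (with its standard error) and the Pearson correlation between FAF- and CFP-derived lesion areas at a single visit.

# Minimal sketch (hypothetical measurements): mean lesion enlargement from baseline
# and the correlation between FAF- and CFP-derived lesion areas at month 12.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(3)
n_eyes = 100
baseline_faf = rng.uniform(2.0, 10.0, size=n_eyes)              # mm^2, hypothetical
month12_faf = baseline_faf + rng.normal(1.85, 1.0, size=n_eyes)
month12_cfp = month12_faf + rng.normal(0.0, 0.5, size=n_eyes)   # CFP tracks FAF with noise

change = month12_faf - baseline_faf
mean_change = np.mean(change)
se_change = np.std(change, ddof=1) / np.sqrt(n_eyes)
r, p_value = pearsonr(month12_faf, month12_cfp)

print(f"mean (SE) 12-month enlargement on FAF: {mean_change:.2f} ({se_change:.2f}) mm^2")
print(f"FAF vs CFP correlation at month 12: r = {r:.2f} (p = {p_value:.3g})")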
Abstract:
The geologic history of the multi-ringed Argyre impact basin and surroundings has been reconstructed on the basis of geologic mapping and relative-age dating of rock materials and structures. The impact formed a primary basin, rim materials, and a complex basement structural fabric including faults and valleys that are radial and concentric about the primary basin, as well as structurally controlled local basins. Since its formation, the basin has been a regional catchment for volatiles and sedimentary materials, as well as a dominant influence on the flow of surface ice, debris flows, and groundwater through and over its basement structures. The basin is interpreted to have been occupied by lakes, including a possible Mediterranean-sized sea that formed in the aftermath of the Argyre impact event. The hypothesized lakes froze and diminished through time, though liquid water may have remained beneath the ice cover and sedimentation may have continued for some time. At its deepest, the main Argyre lake may have taken more than a hundred thousand years to freeze to the bottom even absent any heat source besides the Sun, but with impact-induced hydrothermal heat, geothermal heat flow due to long-lived radioactivities in early Martian history, and concentration of solutes in sub-ice brine, liquid water may have persisted beneath thick ice for many millions of years. An ice-covered sea may have existed long enough for life to originate and evolve under gradually colder and more hypersaline conditions. The Argyre rock materials, diverse in origin and emplacement mechanisms, have been modified by impact, magmatic, eolian, fluvial, lacustrine, glacial, periglacial, alluvial, colluvial, and tectonic processes. Post-impact adjustment of part of the impact-generated basement structural fabric, such as concentric faults, is apparent. Distinct basin-stratigraphic units are interpreted to be linked to large-scale geologic activity far from the basin, including growth of the Tharsis magmatic-tectonic complex and the advance of south polar ice sheets into southern middle latitudes. Along with the migration of surface and sub-surface volatiles towards the central part of the primary basin, the substantial difference in elevation with respect to the surrounding highlands and the Tharsis and Thaumasia highlands results in the trapping of atmospheric volatiles within the basin in the form of fog and regional or local precipitation, even today. In addition, the impact event caused long-term (millions of years) hydrothermal activity and created deep-seated basement structures that have tapped the internal heat of Mars, serving as conduits for a far greater time, possibly even today. This possibility is raised by the observation of putative open-system pingos and nearby gullies that occur in linear depressions with accompanying systems of faults and fractures. Long-term water and heat energy enrichment, complemented by the excavation to the surface and near-surface environs, through the Argyre impact event, of nutrient-enriched primordial crustal and mantle materials favorable to life, has not only resulted in distinct geomorphology but also makes the Argyre basin a potential site of exceptional astrobiological significance.
Abstract:
Knowledge about the vegetation and fire history of the mountains of northern Sicily is scarce. We analysed five sites to fill this gap and used terrestrial plant macrofossils to establish robust radiocarbon chronologies. Palynological records from Gorgo Tondo, Gorgo Lungo, Marcato Cixé, Urgo Pietra Giordano and Gorgo Pollicino show that under natural or near-natural conditions, deciduous forests (Quercus pubescens, Q. cerris, Fraxinus ornus, Ulmus), which included a substantial portion of evergreen broadleaved species (Q. suber, Q. ilex, Hedera helix), prevailed in the upper meso-mediterranean belt. Mesophilous deciduous and evergreen broadleaved trees (Fagus sylvatica, Ilex aquifolium) dominated in the natural or quasi-natural forests of the oro-mediterranean belt. Forests were repeatedly opened for agricultural purposes. Fire activity was closely associated with farming, providing evidence that burning was a primary land use tool since Neolithic times. Land use and fire activity intensified during the Early Neolithic at 5000 BC, at the onset of the Bronze Age at 2500 BC, and at the onset of the Iron Age at 800 BC. Our data and previous studies suggest that the large majority of open land communities in Sicily, from the coastal lowlands to the mountain areas below the thorny-cushion Astragalus belt (ca. 1,800 m a.s.l.), would rapidly develop into forests if land use ceased. Mesophilous Fagus-Ilex forests developed under warm mid-Holocene conditions and were resilient to the combined impacts of humans and climate. This past ecology suggests that these summer-drought-adapted communities are resilient to climate warming of about 2 °C. Hence, they may be particularly suited to provide heat- and drought-adapted Fagus sylvatica ecotypes for maintaining drought-sensitive Central European beech forests under global warming conditions.
Abstract:
The natural history of placebo-treated travelers' diarrhea and the prognostic factors of recovery from diarrhea were evaluated using 9 groups of placebo-treated subjects from 9 clinical trials conducted since 1975, for use as historical controls in future clinical trials of antidiarrheal agents. All of these studies were done by the same group of investigators at one site (Guadalajara, Mexico). The studies are similar in terms of population, measured parameters, microbiologic identification of enteropathogens, and definitions of parameters. The studies had two different durations of follow-up: in some studies subjects were followed for two days, and in others for five days. Using definitions established by the Infectious Diseases Society of America and the Food and Drug Administration, the following efficacy parameters were evaluated: time to last unformed stool (TLUS), number of unformed stools passed on each of the five days following initiation of placebo treatment, microbiologic cure, and improvement of diarrhea. Among the groups that were followed for five days, the mean TLUS ranged from 59.1 to 83.5 hours. Fifty percent to 78% had diarrhea lasting more than 48 hours, and 25% had diarrhea lasting more than five days. The mean number of unformed stools passed on the first day after initiation of therapy ranged from 3.6 to 5.8, and on the fifth day from 0.5 to 1.5. By the end of follow-up, diarrhea had improved in 82.6% to 90% of the subjects. Subjects with enterotoxigenic E. coli had microbiologic cure rates of 21.6% to 90.0%, and subjects with Shigella species experienced microbiologic cure rates of 14.3% to 60.0%. To evaluate the prognostic factors of recovery from diarrhea (the primary efficacy parameter in evaluating the efficacy of antidiarrheal agents against travelers' diarrhea), the subjects from five studies were pooled and the Cox proportional hazards model was used to evaluate the predictors of prolonged diarrhea. After adjusting for the design characteristics of each trial, fever with a rate ratio (RR) of 0.40, presence of invasive pathogens with an RR of 0.41, presence of severe abdominal pain and cramps with an RR of 0.50, more than five watery stools with an RR of 0.60, and presence of non-invasive pathogens with an RR of 0.84 predicted a longer duration of diarrhea. Severe vomiting, with an RR of 2.53, predicted a shorter duration of diarrhea. The number of soft stools, presence of fecal leukocytes, presence of nausea, and duration of diarrhea before enrollment were not associated with duration of diarrhea.
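To illustrate the pooled analysis described above, here is a minimal Python sketch on synthetic data: a Cox proportional hazards model for time to last unformed stool, with trial membership handled as a stratification variable and rate ratios read off as exponentiated coefficients. The covariate names and effect sizes are illustrative assumptions, not the pooled dataset.

# Minimal sketch (synthetic data): Cox proportional hazards model for recovery from
# diarrhea (TLUS), stratified by trial; exp(coef) gives the rate ratio per covariate.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 400
df = pd.DataFrame({
    "fever": rng.binomial(1, 0.25, size=n),
    "invasive_pathogen": rng.binomial(1, 0.2, size=n),
    "severe_vomiting": rng.binomial(1, 0.1, size=n),
    "trial": rng.integers(1, 6, size=n),                 # membership in 5 pooled trials
})

# Hypothetical recovery hazard: fever and invasive pathogens slow recovery (RR < 1),
# severe vomiting speeds it up (RR > 1).
log_hazard = (-0.9 * df["fever"] - 0.9 * df["invasive_pathogen"]
              + 0.9 * df["severe_vomiting"])
df["tlus_hours"] = rng.exponential(scale=60.0 * np.exp(-log_hazard))
df["recovered"] = (df["tlus_hours"] <= 120.0).astype(int)   # censor at five days
df["tlus_hours"] = df["tlus_hours"].clip(upper=120.0)

cph = CoxPHFitter()
cph.fit(df, duration_col="tlus_hours", event_col="recovered", strata=["trial"])
print(cph.summary["exp(coef)"])   # rate ratios for each covariate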
Abstract:
This invited commentary reviews the survey research described in "Examining the Relationship between Media use and Aggression, Sexuality, and Body Image" and situates this research within the recent history of entertainment media regulation.