5 results for Magnitude of the soul
in Digital Commons - Michigan Tech
Abstract:
The time course of lake recovery after a reduction in external loading of nutrients is often controlled by conditions in the sediment. Remediation of eutrophication is hindered by the presence of legacy organic carbon deposits that exert a demand on the terminal electron acceptors of the lake and contribute to problems such as internal nutrient recycling, absence of sediment macrofauna, and flux of toxic metal species into the water column. Quantifying the timing of a lake's response requires determining the magnitude and lability, i.e., the susceptibility to biodegradation, of the organic carbon within the legacy deposit. This characterization is problematic for organic carbon in sediments because of the presence of different fractions of carbon, which vary from highly labile to refractory. The lability of carbon under varied conditions was tested with a bioassay approach. The majority of the organic material in the sediments was found to be conditionally labile, with mineralization potential dependent on prevailing conditions. High labilities were noted under oxygenated conditions and a favorable temperature of 30 °C. Lability decreased when oxygen was removed, and was further reduced when the temperature was dropped to the hypolimnetic average of 8 °C. These results indicate that reversible preservation mechanisms exist in the sediment and are able to protect otherwise labile material from being mineralized under in situ conditions. The concept of an active sediment layer, a region in the sediments in which diagenetic reactions occur (with no reactions occurring below it), was examined through three lines of evidence. First, porewater profiles of oxygen, nitrate, sulfate/total sulfide, ETSA (electron transport system activity, i.e., the activity of oxygen, nitrate, iron/manganese, and sulfate), and methane were considered. Examination of these profiles placed the edge of diagenesis at around 15-20 cm. Second, historical and contemporary TOC profiles were compared to find the point at which the profiles coincide, indicating the depth below which no change has occurred over the 13-year interval between core collections. This analysis suggested that no diagenesis has occurred in Onondaga Lake sediment below a depth of 15 cm. Finally, the time to 99% mineralization, the t99, was estimated using a literature value of the kinetic rate constant for diagenesis. A t99 of 34 years, or approximately 30 cm of sediment depth, resulted for the slowly decaying carbon fraction. Based on these three lines of evidence, an active sediment layer of 15-20 cm is proposed for Onondaga Lake, corresponding to a time since deposition of 15-20 years. While a large legacy deposit of conditionally labile organic material remains in the sediments of Onondaga Lake, preservation mechanisms, which act to shield labile organic carbon from degradation, protect this material from being mineralized and exerting a demand on the terminal electron acceptors of the lake. This has major implications for management of the lake, as it defines the time course of lake recovery following a reduction in nutrient loading.
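The quoted t99 follows from first-order (single-G) decay kinetics, G(t) = G0·exp(-kt), so t99 = ln(100)/k. The sketch below shows that arithmetic with an assumed rate constant and sedimentation rate chosen only to reproduce the quoted 34 years and roughly 30 cm; the thesis's actual literature value and the Onondaga Lake sedimentation rate may differ.

```python
import math

def t99_from_rate_constant(k_per_year: float) -> float:
    """Time (years) to 99% mineralization for first-order decay,
    G(t) = G0 * exp(-k * t); solve 0.01 = exp(-k * t99)."""
    return math.log(100.0) / k_per_year

def depth_since_deposition(t_years: float, sed_rate_cm_per_year: float) -> float:
    """Convert elapsed time to burial depth assuming a constant
    sedimentation rate and ignoring compaction (a simplification)."""
    return t_years * sed_rate_cm_per_year

# Assumed values chosen only to reproduce the numbers quoted in the abstract.
k_slow = 0.135   # yr^-1, rate constant for the slowly decaying carbon fraction
sed_rate = 0.9   # cm/yr, assumed net sedimentation rate

t99 = t99_from_rate_constant(k_slow)
print(f"t99 = {t99:.0f} yr, depth = {depth_since_deposition(t99, sed_rate):.0f} cm")
```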
Abstract:
The High-Altitude Water Cherenkov (HAWC) Experiment is a gamma-ray observatory that utilizes water silos as Cherenkov detectors to measure the electromagnetic air showers created by gamma rays. The experiment consists of an array of closely packed water Cherenkov detectors (WCDs), each with four photomultiplier tubes (PMTs). The direction of the gamma ray will be reconstructed using the times at which the electromagnetic shower front triggers PMTs in each WCD. To achieve an angular resolution as low as 0.1 degrees, a laser calibration system will be used to measure relative PMT response times. The system will direct 300 ps laser pulses into two fiber-optic networks. Each network will use optical fan-outs and switches to direct light to specific WCDs. The first network is used to measure the light transit time out to each pair of detectors, and the second network sends light to each detector, calibrating the response times of its four PMTs. Because the relative PMT response times depend on the number of photons in the light pulse, neutral density filters will be used to control the light intensity across five orders of magnitude. The system will run both continuously in a low-rate mode and in a high-rate mode with many intensity levels. In this thesis, the design of the calibration system and systematic studies verifying its performance are presented.
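A minimal sketch of the timing bookkeeping such a calibration implies: the fiber transit time measured by the first network and an intensity-dependent (slewing) correction derived from the multi-intensity runs are subtracted from each measured hit time to obtain a relative PMT response offset. All names, constants, and the slewing curve below are illustrative placeholders, not the actual HAWC calibration software or values.

```python
from dataclasses import dataclass

@dataclass
class LaserShot:
    wcd_id: int              # which water Cherenkov detector was illuminated
    pmt_id: int              # which of the four PMTs fired
    intensity_pe: float      # estimated photoelectrons in the laser pulse
    measured_time_ns: float  # hit time relative to the laser trigger

# Placeholder calibration constants: fiber transit times to each WCD (from the
# first network) and a toy slewing curve (from the neutral-density filter runs).
fiber_transit_ns = {0: 152.3, 1: 148.7}

def slewing_correction_ns(intensity_pe: float) -> float:
    """Toy slewing model: brighter pulses cross threshold earlier, so the
    apparent response time shifts with pulse intensity."""
    return 2.0 / max(intensity_pe, 1.0) ** 0.5

def pmt_offset_ns(shot: LaserShot) -> float:
    """Relative PMT response time after removing the known fiber delay and
    the intensity-dependent (slewing) shift."""
    return (shot.measured_time_ns
            - fiber_transit_ns[shot.wcd_id]
            - slewing_correction_ns(shot.intensity_pe))

shot = LaserShot(wcd_id=0, pmt_id=2, intensity_pe=50.0, measured_time_ns=155.9)
print(f"PMT {shot.pmt_id} offset: {pmt_offset_ns(shot):.2f} ns")
```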
Abstract:
In this report we investigate the effect of negative energy density in a classic Friedmann cosmology. Although such components have never been measured and may be unphysical, the evolution of a Universe containing a significant cosmological abundance of any of a number of hypothetical stable negative energy components is explored. These negative energy (Ω < 0) forms include negative phantom energy (w < -1), negative cosmological constant (w = -1), negative domain walls (w = -2/3), negative cosmic strings (w = -1/3), negative mass (w = 0), negative radiation (w = 1/3), and negative ultra-light (w > 1/3). Assuming that such universe components generate pressures as perfect fluids, the attractive or repulsive nature of each negative energy component is reviewed. The Friedmann equations can only be balanced when negative energies are coupled to a greater magnitude of positive energy or positive curvature, and minimal cases of both are reviewed. The future and fate of such universes are reviewed in terms of curvature, temperature, acceleration, and energy density, including endings categorized as a Big Crunch, Big Void, or Big Rip and further qualified as "Warped", "Curved", or "Flat", "Hot" versus "Cold", and "Accelerating" versus "Decelerating" versus "Coasting". A universe that ends by contracting to zero energy density is termed a Big Poof. Which contracting universes "bounce" into expansion and which expanding universes "turn over" into contraction are also reviewed. The names used for these endings of the Universe are our own nomenclature.
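Each perfect-fluid component scales as ρ ∝ a^(-3(1+w)), so the Friedmann constraint (H/H0)² = Σ Ωi a^(-3(1+wi)) + Ωk a^(-2) determines where expansion or contraction is allowed. The sketch below evaluates this for an illustrative mixture of a positive cosmological constant and a negative-mass (w = 0) component; the densities are assumptions, not values from the report.

```python
# (H/H0)^2 = sum_i Omega_i * a**(-3*(1 + w_i)) + Omega_k * a**(-2).
# A component with Omega_i < 0 is only allowed where the positive terms
# dominate; where the right-hand side reaches zero, an expanding universe
# turns over or a contracting one bounces.
components = [
    (+1.3, -1.0),   # positive cosmological constant (w = -1), assumed density
    (-0.3,  0.0),   # negative mass, i.e. pressureless dust (w = 0), assumed density
]
omega_k = 1.0 - sum(om for om, _ in components)  # closure condition at a = 1

def H2_over_H02(a: float) -> float:
    """Right-hand side of the Friedmann equation in units of H0^2."""
    return (sum(om * a ** (-3.0 * (1.0 + w)) for om, w in components)
            + omega_k * a ** -2.0)

for a in (0.3, 0.5, 1.0, 2.0, 5.0):
    h2 = H2_over_H02(a)
    verdict = "allowed" if h2 >= 0 else "forbidden (bounce/turnover boundary nearby)"
    print(f"a = {a:3.1f}:  (H/H0)^2 = {h2:+8.3f}  -> {verdict}")
```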
Abstract:
Nitrogen oxides play a crucial role in the budget of tropospheric ozone (O3) and the formation of the hydroxyl radical. Anthropogenic activities and boreal wildfires are large sources of emissions in the atmosphere. However, the influence of the transport of these emissions on nitrogen oxide and O3 levels at hemispheric scales is not well understood, in particular due to a lack of nitrogen oxide measurements in remote regions. In order to address these deficiencies, measurements of NO, NO2, and NOy (total reactive nitrogen oxides) were made in the lower free troposphere (FT) over the central North Atlantic region (Pico Mountain station, 38°N 28°W, 2.3 km asl) from July 2002 to August 2005. These measurements reveal a well-defined seasonal cycle of nitrogen oxides (NOx = NO + NO2, and NOy) in the background central North Atlantic lower FT, with higher mixing ratios during the summertime. Observed NOx and NOy levels are consistent with long-range transport of emissions, but with significant removal en route to the measurement site. Reactive nitrogen largely exists in the form of PAN and HNO3 (~80-90% of NOy) all year round. A shift in the composition of NOy from dominance of PAN to dominance of HNO3 occurs from winter-spring to summer-fall, as a result of changes in temperature and photochemistry over the region. Analysis of the impact of long-range transport of boreal wildfire emissions on nitrogen oxides provides evidence of the very large-scale impacts of boreal wildfires on the tropospheric NOx and O3 budgets. Boreal wildfire emissions are responsible for significant shifts in the nitrogen oxide distributions toward higher levels during the summer, with medians of NOy (117-175 pptv) and NOx (9-30 pptv) greater in the presence of boreal wildfire emissions. Extreme levels of NOx (up to 150 pptv) and NOy (up to 1100 pptv) observed in boreal wildfire plumes suggest that decomposition of PAN to NOx is a significant source of NOx, and imply that O3 formation occurs during transport. Ozone levels are also significantly enhanced in boreal wildfire plumes. However, O3 behaves in a complex manner in the plumes, varying from significant O3 production, to weaker production, to O3 destruction. Long-range transport of anthropogenic emissions from North America also has a significant influence on the regional NOx and O3 budgets. Transport of pollution from North America causes significant enhancements in nitrogen oxides year-round. Enhancements of CO, NOy, and NOx indicate that, consistent with previous studies, more than 95% of the NOx emitted over the U.S. is removed before and during export out of the U.S. boundary layer. However, about 30% of the NOx emissions exported out of the U.S. boundary layer remain in the airmasses. Since the lifetime of NOx is shorter than the transport timescale, PAN decomposition and potentially photolysis of HNO3 provide a supply of NOx over the central North Atlantic lower FT. Observed ΔO3/ΔNOy ratios and the large NOy levels remaining in the North American plumes suggest potential O3 formation well downwind of North America. Finally, a comparison of the nitrogen oxide measurements with results from the global chemical transport (GCT) model GEOS-Chem identifies differences between the observations and the model. GEOS-Chem reproduces the seasonal variation of nitrogen oxides over the central North Atlantic lower FT, but does not capture the magnitude of the cycles. Improvements in our understanding of nitrogen oxide chemistry in the remote FT and of emission sources are necessary for current GCT models to adequately estimate the impacts of emissions on tropospheric NOx and the resulting impacts on the O3 budget.
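The export-efficiency statements above rest on comparing enhancement ratios in aged plumes with source-region emission ratios. A minimal sketch of that calculation follows; the emission ratio and plume enhancement ratio are placeholders for illustration, not measured values from the thesis.

```python
# Compare the enhancement ratio observed in aged North American plumes with
# the source-region emission ratio to estimate how much emitted NOx survives
# export as total reactive nitrogen (NOy).  All numbers are placeholders.
emission_nox_to_co = 0.15   # mol NOx per mol CO emitted over the U.S. (assumed)
plume_dnoy_to_dco = 0.006   # observed Delta NOy / Delta CO at the site (assumed)

exported_fraction = plume_dnoy_to_dco / emission_nox_to_co
print(f"Exported as NOy: {exported_fraction:.0%}; "
      f"removed before/during export: {1 - exported_fraction:.0%}")
```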
Abstract:
Large earthquakes may strongly influence the activity of volcanoes through static and dynamic processes. In this study, we quantify the static and dynamic stress changes on 27 volcanoes in Central America after the Mw 7.6 Costa Rica earthquake of 5 September 2012. Following this event, 8 volcanoes showed signs of activity. We calculated the static stress change due to the earthquake on hypothetical faults under these volcanoes with Coulomb 3.3. For the dynamic stress change, we computed synthetic seismograms to simulate the waveforms at these volcanoes. We then calculated the Peak Dynamic Stress (PDS) from the modeled peak ground velocities. The resulting values represent moderate to minor changes in stress (10⁻¹ to 10⁻² MPa), with the PDS values generally an order of magnitude larger than the static stress changes. Although these values are small, they may be enough to trigger a response by the volcanoes, and they are on the order of the stress changes implicated in many other studies of volcano and earthquake triggering by large earthquakes. This study provides insight into the poorly constrained mechanism of remote triggering.
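Peak dynamic stress is commonly estimated from peak ground velocity with the plane-wave approximation PDS ≈ μ·PGV/Vs, as in other remote-triggering studies. A minimal sketch under assumed crustal values; the PGV is a placeholder, not one of the modeled values from this study.

```python
# Plane-wave approximation for peak dynamic stress: PDS ~ mu * PGV / Vs.
# Shear modulus, shear-wave velocity, and PGV below are assumed values.
mu = 3.0e10     # Pa, crustal shear modulus
vs = 3500.0     # m/s, shear-wave velocity
pgv = 0.01      # m/s, example peak ground velocity from a synthetic seismogram

pds_pa = mu * pgv / vs
print(f"Peak dynamic stress: {pds_pa:.2e} Pa ({pds_pa / 1e6:.3f} MPa)")
```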