7 results for BY-EVENT FLUCTUATIONS

in Digital Commons - Michigan Tech


Relevance:

30.00%

Publisher:

Abstract:

Isolated water-soluble analytes extracted from fog water collected during a radiation fog event near Fresno, CA, were analyzed using collision-induced dissociation and ultrahigh-resolution mass spectrometry. Tandem mass analysis was performed over scan ranges between 100 and 400 u to characterize the structures of nitrogen- and/or sulfur-containing species. CHNO, CHOS, and CHNOS compounds were targeted specifically because of the high number of oxygen atoms contained in their molecular formulas. The presence of 22 neutral losses corresponding to fragment ions was evaluated for each of the 1308 precursors. Priority neutral losses represent specific polar functional groups (H2O, CO2, CH3OH, HNO3, SO3, etc., and several combinations of these). Additional neutral losses represent non-specific functional groups (CO, CH2O, C3H8, etc.). Five distinct monoterpene-derived organonitrates, organosulfates, and nitroxy-organosulfates were observed in this study, including C10H16O7S, C10H17NO7S, C10H17NO8S, C10H17NO9S, and C10H17NO10S. Nitrophenols and linear alkylbenzene sulfonates were present in high abundance. Liquid chromatography/mass spectrometry methodology was developed to isolate and quantify nitrophenols based on their fragmentation behavior.
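
As a rough illustration of the neutral-loss screening step described above, the sketch below matches precursor-fragment mass differences against a table of candidate neutral losses within a tolerance; the masses, tolerance, and abbreviated loss table are illustrative placeholders rather than values or code from the study.

```python
# Sketch: match precursor-fragment mass differences to candidate neutral losses.
# Masses are monoisotopic (u); the loss table is abbreviated and illustrative.
NEUTRAL_LOSSES = {
    "H2O": 18.0106,
    "CO2": 43.9898,
    "CH3OH": 32.0262,
    "HNO3": 62.9956,
    "SO3": 79.9568,
    "CO": 27.9949,
    "CH2O": 30.0106,
}

def match_neutral_losses(precursor_mz, fragment_mzs, tol=0.002):
    """Return the candidate neutral losses consistent with each fragment ion."""
    hits = {}
    for frag in fragment_mzs:
        delta = precursor_mz - frag
        for name, mass in NEUTRAL_LOSSES.items():
            if abs(delta - mass) <= tol:
                hits.setdefault(frag, []).append(name)
    return hits

# Hypothetical precursor at m/z 295.0493 showing losses of H2O and SO3
print(match_neutral_losses(295.0493, [277.0387, 215.0925]))
```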

Relevance:

30.00%

Publisher:

Abstract:

Turbulence affects traditional free space optical communication by causing speckle to appear in the received beam profile. This occurs due to changes in the refractive index of the atmosphere caused by fluctuations in temperature and pressure, resulting in an inhomogeneous medium. The Gaussian-Schell model of partial coherence has been suggested as a means of mitigating these atmospheric inhomogeneities on the transmission side. This dissertation analyzed the Gaussian-Schell model of partial coherence by verifying the model in the far field, investigating the number of independent phase control screens necessary to approach the ideal Gaussian-Schell model, and showing experimentally that the Gaussian-Schell model of partial coherence is achievable in the far field using a liquid crystal spatial light modulator. A method for optimizing the statistical properties of the Gaussian-Schell model was developed to maximize the coherence of the field while ensuring that it does not exhibit the same statistics as a fully coherent source. Finally, a technique was developed to estimate the minimum spatial resolution necessary in a spatial light modulator to effectively propagate the Gaussian-Schell model through a range of atmospheric turbulence strengths. This work showed that regardless of turbulence strength or receiver aperture, transmitting the Gaussian-Schell model of partial coherence instead of a fully coherent source yields a reduction in the intensity fluctuations of the received field. By measuring the variance of the intensity fluctuations and the received mean, it is shown through the scintillation index that using the Gaussian-Schell model of partial coherence is a simple and straightforward alternative to traditional adaptive optics for mitigating atmospheric turbulence in free space optical communications.
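
The scintillation index mentioned above can be illustrated with a short sketch: it computes sigma_I^2 = <I^2>/<I>^2 - 1 from a series of received intensity samples. The synthetic log-normal samples below stand in for measured irradiance data and are not from the experiments described.

```python
import numpy as np

def scintillation_index(intensity):
    """sigma_I^2 = <I^2>/<I>^2 - 1 for a series of received intensity samples."""
    intensity = np.asarray(intensity, dtype=float)
    return intensity.var() / intensity.mean() ** 2

# Synthetic intensity records: a fully coherent beam with stronger fluctuations
# versus a partially coherent (Gaussian-Schell) beam with weaker fluctuations.
rng = np.random.default_rng(0)
coherent = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)
gsm_beam = rng.lognormal(mean=0.0, sigma=0.2, size=10_000)

print(f"coherent source        : {scintillation_index(coherent):.3f}")
print(f"Gaussian-Schell source : {scintillation_index(gsm_beam):.3f}")
```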

Relevance:

30.00%

Publisher:

Abstract:

Statistical analyses of temporal relationships between large earthquakes and volcanic eruptions suggest that seismic waves may trigger eruptions even over great (>1000 km) distances, although the causative mechanism is not well constrained. In this study, the relationship between large earthquakes and subtle changes in volcanic activity was investigated in order to gain greater insight into the relationship between dynamic stresses propagated by surface waves and volcanic response. Daily measurements from the Ozone Monitoring Instrument (OMI), onboard the Aura satellite, provide constraints on volcanic sulfur dioxide (SO2) emission rates as a measure of subtle changes in activity. Time series of SO2 emission rates were produced from OMI data for thirteen persistently active volcanoes from 1 October 2004 to 30 September 2010. In order to quantify the effect of earthquakes at teleseismic distances, we modeled surface-wave amplitudes from the source mechanisms of moment magnitude (Mw) ≥7 earthquakes and calculated the peak dynamic stress (PDS). We assessed the influence of earthquakes on volcanic activity in two ways: 1) by identifying increases in the SO2 time series data and looking for causative earthquakes, and 2) by examining the average emission rate before and after each earthquake. In the first, the SO2 time series for each volcano was used to calculate a baseline threshold for comparison with post-earthquake emission. Next, we generated a catalog of responses based on sustained SO2 emission increases above this baseline. Delay times between each SO2 response and each prior earthquake were analyzed using both the actual earthquake catalog and a randomly generated catalog of earthquakes. This process was repeated for each volcano. Despite varying multiple parameters, this analysis did not demonstrate a clear relationship between earthquake-generated PDS and SO2 emission. However, the second analysis, which was based on the occurrence of large earthquakes, indicated a response at most volcanoes. Using the PDS calculations as a filtering criterion for the earthquake catalog, the SO2 mass for each volcano was analyzed in 28-day windows centered on the earthquake origin time. If the average SO2 mass after the earthquake was greater than an arbitrary percentage of the pre-earthquake mass, we identified the volcano as having a response to the event. This window analysis provided insight into which types of volcanic activity are more susceptible to triggering by dynamic stress. The volcanoes with very open systems included in this study (Ambrym, Gaua, Villarrica, Erta Ale, and Turrialba) showed a clear response to dynamic stress, while the volcanoes with more closed systems (Merapi, Semeru, Fuego, Pacaya, and Bagana) showed no response.
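
A minimal sketch of the 28-day window test is given below, assuming daily SO2 masses indexed by day number; the half-window length, response threshold, and synthetic data are illustrative assumptions, not the study's actual parameters.

```python
import numpy as np

def window_response(day, so2_mass, eq_day, half_window=14, threshold=1.2):
    """Flag a response if the mean SO2 mass in the post-earthquake half-window
    exceeds `threshold` times the mean in the pre-earthquake half-window."""
    day = np.asarray(day)
    so2_mass = np.asarray(so2_mass, dtype=float)
    pre = so2_mass[(day >= eq_day - half_window) & (day < eq_day)]
    post = so2_mass[(day >= eq_day) & (day < eq_day + half_window)]
    if pre.size == 0 or post.size == 0:
        return False
    return np.nanmean(post) > threshold * np.nanmean(pre)

# Synthetic daily SO2 masses (arbitrary units) around an earthquake on day 14
days = np.arange(28)
mass = np.concatenate([np.full(14, 100.0), np.full(14, 160.0)])
print(window_response(days, mass, eq_day=14))  # True
```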

Relevance:

30.00%

Publisher:

Abstract:

During a project, managers encounter numerous contingencies and are faced with the challenging task of making decisions that will effectively keep the project on track. This task is very challenging because construction projects are non-prototypical and the processes are irreversible. Therefore, it is critical to apply a methodological approach during the planning phase to develop a few alternative management decision strategies, which can be deployed to manage alternative scenarios resulting from expected and unexpected disruptions in the as-planned schedule. Such a methodology should have the following features, which are missing in the existing research: (1) looking at the effects of local decisions on the global project outcomes, (2) studying how a schedule responds to decisions and disruptive events, because the risk in a schedule is a function of the decisions made, (3) establishing a method to assess and improve the management decision strategies, and (4) developing project-specific decision strategies, because each construction project is unique and the lessons from a particular project cannot be easily applied to projects that have different contexts. The objective of this dissertation is to develop a schedule-based simulation framework to design, assess, and improve sequences of decisions for the execution stage. The contribution of this research is the introduction of decision strategies for managing a project and the establishment of an iterative methodology to continuously assess and improve decision strategies and schedules. Project managers or schedulers can implement the methodology at the planning stage to develop and identify schedules accompanied by suitable decision strategies for managing a project. The developed methodology also lays the foundation for an algorithm that continuously and automatically generates satisfactory schedules and strategies throughout the construction life of a project. Different from studying isolated daily decisions, the proposed framework introduces the notion of decision strategies to manage the construction process. A decision strategy is a sequence of interdependent decisions determined by resource allocation policies such as labor, material, equipment, and space policies. The schedule-based simulation framework consists of two parts: experiment design and result assessment. The core of the experiment design is the establishment of an iterative method to test and improve decision strategies and schedules, which is based on the introduction of decision strategies and the development of a schedule-based simulation testbed. The simulation testbed used is the Interactive Construction Decision Making Aid (ICDMA). ICDMA, which was previously developed, has an emulator that duplicates the construction process and a random event generator that allows the decision-maker to respond to disruptions in the emulation. It is used to study how the schedule responds to these disruptions and to the corresponding decisions made over the duration of the project, while accounting for cascading impacts and dependencies between activities. The dissertation is organized into two parts. The first part presents the existing research, identifies the departure points of this work, and develops a schedule-based simulation framework to design, assess, and improve decision strategies. In the second part, the proposed schedule-based simulation framework is applied to investigate specific research problems.
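
The sketch below is a generic, simplified illustration of the idea of evaluating alternative decision strategies against randomly generated disruptions; it is not the ICDMA testbed or its API, and the activities, policies, and disruption model are hypothetical placeholders.

```python
import random

# Illustrative activities as (name, base duration in days); not a real schedule.
ACTIVITIES = [("excavate", 5), ("foundation", 8), ("framing", 10)]

def simulate(strategy, seed):
    """Run one project realization and return its total duration in days."""
    rng = random.Random(seed)
    total = 0
    for name, base in ACTIVITIES:
        duration = base
        if rng.random() < 0.3:                 # a random disruptive event occurs
            delay = rng.randint(1, 4)
            duration += strategy(name, delay)  # the policy decides the impact
        total += duration
    return total

def absorb_with_overtime(activity, delay):
    return max(0, delay - 2)  # e.g. reallocate labor to recover about two days

def accept_delay(activity, delay):
    return delay              # take the delay as-is

for strategy in (absorb_with_overtime, accept_delay):
    durations = [simulate(strategy, seed) for seed in range(1000)]
    print(strategy.__name__, sum(durations) / len(durations))
```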

Relevance:

30.00%

Publisher:

Abstract:

Measuring shallow seismic sources provides a way to reveal processes that cannot be directly observed, but the correct interpretation and value of these signals depend on the ability to distinguish source from propagation effects. Furthermore, seismic signals produced by a resonating source can look almost identical to those produced by impulsive sources and modified along the path. Distinguishing these two phenomena can be accomplished by examining the wavefield with small-aperture arrays or by recording seismicity near the source when possible. We examine source and path effects in two different environments: Bering Glacier, Alaska, and Villarrica Volcano, Chile. Using three 3-element seismic arrays near the terminus of the Bering Glacier, we identified and located both terminus calving and iceberg breakup events. We show that automated array analysis provided a robust way to locate icequake events using P waves. This analysis also showed that arrivals within the long-period codas were incoherent across the small-aperture arrays, demonstrating that these codas, previously attributed to crack resonance, were in fact the result of a complicated path rather than a source effect. At Villarrica Volcano, seismometers deployed from near the vent to ~10 km away revealed that a several-cycle long-period source signal recorded at the vent appeared elongated in the far field. We used data collected from the stations nearest the vent to invert for the repetitive seismic source and found that it corresponded to a shallow force within the lava lake oriented N75°E and dipping 7° from horizontal. We also used this repetitive signal to search the data for additional seismic and infrasonic properties, including seismic-acoustic delay times, volcano acoustic-seismic ratios and energies, event frequency, and real-time seismic amplitude measurements. These calculations revealed lava lake level and activity fluctuations consistent with lava lake level changes inferred from the persistent infrasonic tremor.
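
Two of the metrics listed above can be illustrated with a short sketch: the seismic-acoustic delay time, estimated here by cross-correlation, and a volcano acoustic-seismic ratio taken as the ratio of acoustic to seismic energy proxies. The synthetic waveforms and sampling rate are placeholders, and the exact definitions used in the study may differ.

```python
import numpy as np

FS = 100.0  # sampling rate in samples per second (illustrative)

def seismic_acoustic_delay(seismic, acoustic, fs=FS):
    """Lag (s) of the acoustic trace relative to the seismic trace."""
    xcorr = np.correlate(acoustic - acoustic.mean(),
                         seismic - seismic.mean(), mode="full")
    lag_samples = np.argmax(xcorr) - (len(seismic) - 1)
    return lag_samples / fs

def acoustic_seismic_ratio(seismic, acoustic):
    """Ratio of acoustic to seismic energy proxies (sums of squared amplitudes)."""
    return np.sum(acoustic ** 2) / np.sum(seismic ** 2)

# Synthetic traces: the acoustic pulse is a delayed copy of the seismic pulse.
t = np.arange(0.0, 10.0, 1.0 / FS)
seismic = np.exp(-((t - 2.0) ** 2) / 0.1) * np.sin(2 * np.pi * 5.0 * t)
acoustic = np.exp(-((t - 4.5) ** 2) / 0.1) * np.sin(2 * np.pi * 5.0 * (t - 2.5))

print(f"seismic-acoustic delay : {seismic_acoustic_delay(seismic, acoustic):.2f} s")
print(f"acoustic/seismic ratio : {acoustic_seismic_ratio(seismic, acoustic):.2f}")
```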

Relevance:

30.00%

Publisher:

Abstract:

This dissertation presents experimental and numerical investigations of combustion initiation triggered by electrical-discharge-induced plasma within lean and dilute methane-air mixtures. This research topic is of interest due to its potential to further promote the understanding and prediction of spark ignition quality in high-efficiency gasoline engines, which operate with lean and dilute fuel-air mixtures. It is argued in this dissertation that the plasma-to-flame transition is the key process during the spark ignition event, yet it is also the most complicated and least understood. The investigation therefore focuses on the overlapping period when plasma and flame both exist in the system. The experimental study is divided into two parts. Experiments in Part I focus on the flame kernel resulting from the electrical discharge. A number of external factors are found to affect the growth of the flame kernel, resulting in complex correlations between the discharge and the flame kernel. Heat loss from the flame kernel to the cold ambient is found to be a dominant factor that quenches the flame kernel. Another experimental focus is the plasma channel. Electrical discharges into gases induce intense and highly transient plasma. Detailed observation of the size and contents of the discharge-induced plasma channel is performed. Given the complex correlations and the multidisciplinary physical/chemical processes involved in the plasma-flame transition, the modeling principle adopted is to reproduce the detailed transition numerically with minimal analytical assumptions. Detailed measurements obtained from the experimental work facilitate a more accurate description of the initial reaction conditions. A novel spark source considering both energy and species deposition is defined in a justified manner, which is the key feature of this Ignition by Plasma (IBP) model. The results of the numerical simulation are intuitive, and the potential of numerical simulation to better resolve the complex spark ignition mechanism is presented. Meanwhile, imperfections of the IBP model and the numerical simulation have been identified and will require future attention.
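
To illustrate the quenching role of heat loss noted above, the sketch below integrates a toy zero-dimensional kernel energy balance; it is not the IBP model, and every coefficient is an arbitrary illustrative value chosen only to contrast sustained growth with quenching.

```python
import numpy as np

def kernel_temperature(loss_coeff, t_end=2e-3, dt=1e-6):
    """Integrate dT/dt = (chemical release - heat loss) / thermal mass (toy model)."""
    T, T_amb = 3000.0, 300.0       # K: initial kernel and cold ambient (arbitrary)
    thermal_mass = 1.0e-6          # J/K (arbitrary)
    for _ in np.arange(0.0, t_end, dt):
        q_chem = 3.0 if T > 1500.0 else 0.0   # W: crude reaction threshold
        q_loss = loss_coeff * (T - T_amb)     # W: loss to the cold ambient
        T += (q_chem - q_loss) / thermal_mass * dt
    return T

for h in (5.0e-4, 5.0e-3):         # W/K: mild versus strong heat loss (arbitrary)
    print(f"loss coefficient {h:.0e} W/K -> "
          f"final kernel temperature {kernel_temperature(h):7.1f} K")
```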

Relevance:

30.00%

Publisher:

Abstract:

Space-based (satellite, scientific probe, space station, etc.) and millimeter- to micro-scale (such as are used in high-power electronics cooling, weapons cooling in aircraft, etc.) condensers and boilers are shear/pressure driven. They are of increasing interest to system engineers for thermal management because flow boilers and flow condensers offer both high flow-rate-specific heat transfer capacity and very low thermal resistance between the fluid and the heat exchange surface, so large amounts of heat may be removed using reasonably sized devices without the need for excessive temperature differences. However, flow stability issues and degradation of performance of shear/pressure-driven condensers and boilers, due to undesirable flow morphology over large portions of their lengths, have mostly prevented their use in these applications. This research is part of an ongoing investigation seeking to close the gap between science and engineering by analyzing two key innovations that could help address these problems. First, it is recommended that the condenser and boiler be operated in an innovative flow configuration that provides a non-participating core vapor stream to stabilize the annular flow regime throughout the device length, accomplished in an energy-efficient manner by means of ducted vapor re-circulation. This is demonstrated experimentally. Second, suitable pulsations applied to the vapor entering the condenser or boiler (from the re-circulating vapor stream) greatly reduce the thermal resistance of the already effective annular flow regime. For the experiments reported here, application of pulsations increased time-averaged heat flux by up to 900% at a location within the flow condenser and up to 200% at a location within the flow boiler, measured at the heat-exchange surface. Traditional fully condensing flows, reported here for comparison purposes, show similar heat-flux enhancements due to imposed pulsations over a range of frequencies. Shear/pressure-driven condensing and boiling flow experiments are carried out in horizontal mm-scale channels with heat exchange through the bottom surface. The sides and top of the flow channel are insulated. The fluid is FC-72 from 3M Corporation.
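
The reported comparison metric, the percentage increase in time-averaged heat flux when pulsations are applied, can be illustrated with a short sketch; the two synthetic heat-flux traces are placeholders chosen to reproduce an enhancement of roughly 900%, not measured data.

```python
import numpy as np

def enhancement_percent(q_baseline, q_pulsed):
    """Percent increase of the time-averaged pulsed heat flux over the baseline."""
    return 100.0 * (np.mean(q_pulsed) - np.mean(q_baseline)) / np.mean(q_baseline)

# Synthetic heat-flux traces (W/cm^2) at one measurement location; values chosen
# so the pulsed time average is roughly ten times the steady baseline.
t = np.linspace(0.0, 10.0, 1000)                     # s
q_steady = np.full_like(t, 2.0)                      # without imposed pulsations
q_pulsed = 20.0 + 5.0 * np.sin(2 * np.pi * 2.0 * t)  # with imposed pulsations

print(f"time-averaged enhancement: {enhancement_percent(q_steady, q_pulsed):.0f}%")
```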