999 results for MTU


Relevance:

10.00%

Publisher:

Abstract:

Tropospheric ozone (O3) and carbon monoxide (CO) pollution in the Northern Hemisphere is commonly thought to be of anthropogenic origin. While this is true in most cases, copious quantities of pollutants are emitted by fires in boreal regions, and the impact of these fires on CO has been shown to significantly exceed the impact of urban and industrial sources during large fire years. The impact of boreal fires on ozone is still poorly quantified, and large uncertainties exist in the estimates of the fire-released nitrogen oxides (NOx), a critical factor in ozone production. As boreal fire activity is predicted to increase in the future due to its strong dependence on weather conditions, it is necessary to understand how these fires affect atmospheric composition. To determine the scale of boreal fire impacts on ozone and its precursors, this work combined statistical analysis of ground-based measurements downwind of fires, satellite data analysis, transport modeling and the results of chemical model simulations. The first part of this work focused on determining boreal fire impact on ozone levels downwind of fires, using analysis of observations in several-days-old fire plumes intercepted at the Pico Mountain station (Azores). The results of this study revealed that fires significantly increase midlatitude summertime ozone background during high fire years, implying that predicted future increases in boreal wildfires may affect ozone levels over large regions in the Northern Hemisphere. To improve current estimates of NOx emissions from boreal fires, we further analyzed ΔNOy/ΔCO enhancement ratios in the observed fire plumes together with transport modeling of fire emission estimates. The results of this analysis revealed the presence of a considerable seasonal trend in the fire NOx/CO emission ratio due to the late-summer changes in burning properties. This finding implies that the constant NOx/CO emission ratio currently used in atmospheric modeling is unrealistic, and is likely to introduce a significant bias in the estimated ozone production. Finally, satellite observations were used to determine the impact of fires on atmospheric burdens of nitrogen dioxide (NO2) and formaldehyde (HCHO) in the North American boreal region. This analysis demonstrated that fires dominated the HCHO burden over the fires and in plumes up to two days old. This finding provides insights into the magnitude of secondary HCHO production and further enhances scientific understanding of the atmospheric impacts of boreal fires.
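As an illustration of how an enhancement ratio such as ΔNOy/ΔCO can be estimated from co-located plume measurements, the sketch below regresses background-subtracted NOy on background-subtracted CO; the mixing ratios, background values, and the use of an ordinary least-squares slope are illustrative assumptions, not the study's data or fitting method.

```python
# Hedged illustration: estimating a dNOy/dCO enhancement ratio from
# hypothetical co-located plume measurements (ppbv). Background values and
# the ordinary least-squares fit are assumptions, not the study's method.
import numpy as np

co  = np.array([135.0, 150.0, 162.0, 178.0, 190.0])  # CO in plume samples
noy = np.array([0.62, 0.71, 0.78, 0.90, 0.97])       # NOy in the same samples
co_bg, noy_bg = 110.0, 0.45                          # assumed local backgrounds

d_co, d_noy = co - co_bg, noy - noy_bg               # enhancements above background
slope, intercept = np.polyfit(d_co, d_noy, 1)        # slope = enhancement ratio
print(f"dNOy/dCO enhancement ratio ~ {slope:.4f} ppbv/ppbv")
```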

Relevance:

10.00%

Publisher:

Abstract:

The United States transportation industry is predicted to consume approximately 13 million barrels of liquid fuel per day by 2025. If one percent of that fuel energy were salvaged through waste heat recovery, there would be a reduction of 130 thousand barrels of liquid fuel per day. This dissertation focuses on automotive waste heat recovery techniques, with an emphasis on two novel techniques. The first technique investigated was a combined coolant- and exhaust-based Rankine cycle system, which utilized a patented piston-in-piston engine technology. The research scope included a simulation of the maximum mass flow rate of steam (700 K and 5.5 MPa) from two heat exchangers, the potential power generation from the secondary piston steam chambers, and the resulting steam quality within the steam chamber. The secondary piston chamber provided supplemental steam power strokes during the engine's compression and exhaust strokes to reduce the pumping work of the engine. A Class-8 diesel engine, operating at 1,500 RPM at full load, showed a maximum increase in brake fuel conversion efficiency of 3.1%. The second technique investigated was the implementation of thermoelectric generators (TEGs) on the outer cylinder walls of a liquid-cooled internal combustion engine. The research scope focused on the energy generation, fuel energy distribution, and cylinder wall temperatures. The analysis was conducted over a range of engine speeds and loads in a two-cylinder, 19.4 kW, liquid-cooled, spark-ignition engine. The cylinder wall temperatures increased by 17% to 44%, which correlated well with the 4.3% to 9.5% decrease in coolant heat transfer. Only 23.3% to 28.2% of the heat transfer to the coolant was transferred through the TEG and TEG surrogate material. The gross indicated work decreased by 0.4% to 1.0%, and the exhaust gas energy decreased by 0.8% to 5.9%. Due to coolant contamination, the TEG output could not be measured directly; instead, it was predicted from cylinder wall temperatures and manufacturer documentation to be less than 0.1% of the cumulative heat release. Higher TEG conversion efficiencies, combined with greater control of heat transfer paths, would be needed to improve energy output and make this a viable waste heat recovery technique.
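The savings figure quoted in the opening sentences follows from simple arithmetic; the short sketch below reproduces it (the per-barrel energy content used for context is an assumed round number, not a value from the dissertation).

```python
# Back-of-the-envelope check of the waste-heat-recovery savings quoted above.
fuel_demand_bbl_per_day = 13_000_000   # projected 2025 U.S. transportation demand
recovered_fraction = 0.01              # 1% of fuel energy salvaged

savings_bbl_per_day = fuel_demand_bbl_per_day * recovered_fraction
print(f"Fuel saved: {savings_bbl_per_day:,.0f} barrels/day")  # 130,000 barrels/day

# Context only; ~6.1 GJ per barrel is an assumed approximate energy content.
print(f"~{savings_bbl_per_day * 6.1 / 1e6:.2f} PJ of fuel energy per day")
```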

Relevance:

10.00%

Publisher:

Abstract:

The main objective of this research was to investigate pyrolysis and torrefaction of forest biomass species using a micropyrolysis instrument. It was found that 30-45% of the original sample mass remained as bio-char in the pyrolysis temperature range of 500-700 °C for aspen, balsam, and switchgrass. The non-char mass was converted to gaseous and vapor products, of which 10-55% was water and syngas, 2-12% acetic acid, 2-12% hydroxypropanone, 1-3% furaldehyde, and 5-15% various phenolic compounds. In addition, several general trends in the evolution of gaseous species were identified when woody feedstocks were pyrolyzed. With increasing temperature it was observed that: (1) the volume of gas produced increased, (2) the volume of CO2 decreased and the volumes of CO and CH4 increased, and (3) the rates of gas evolution increased. In the torrefaction temperature range (200-300 °C), two mechanistic models were developed to predict the rates of CO2 and acetic acid product formation. The models fit the general trend of the experimental data well, but suggestions for future improvement were also noted. Finally, it was observed that using torrefaction as a precursor to pyrolysis improves the quality of bio-oil over traditional pyrolysis by reducing the acidity through removal of acetic acid, reducing the O/C ratio through removal of some oxygenated species, and removing a portion of the water.
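The abstract does not state the form of the two mechanistic models, so the sketch below uses a generic first-order Arrhenius rate law as a stand-in to show how the formation rate of a torrefaction product (e.g., CO2 or acetic acid) can be expressed as a function of temperature; the kinetic parameters are placeholders, not values fitted in this work.

```python
# Stand-in illustration: first-order Arrhenius kinetics for the release of a
# torrefaction product (e.g., CO2 or acetic acid). Parameters are placeholders.
import numpy as np

R  = 8.314    # J/(mol*K), gas constant
A  = 1.0e7    # 1/s, placeholder pre-exponential factor
Ea = 110e3    # J/mol, placeholder activation energy

def formation_rate(unreacted_fraction, temp_c):
    """d(alpha)/dt = k(T) * (1 - alpha), with k(T) = A * exp(-Ea / (R*T))."""
    k = A * np.exp(-Ea / (R * (temp_c + 273.15)))
    return k * unreacted_fraction

# Compare rates across the torrefaction window of 200-300 C
for temp in (200, 250, 300):
    print(f"{temp} C -> {formation_rate(1.0, temp):.2e} 1/s")
```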

Relevance:

10.00%

Publisher:

Abstract:

Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on and threats to forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory collected by different sampling methods and generate forest inventory information across large spatial extents. The forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for a part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area applications of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or the estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
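For readers unfamiliar with the technique, the following is a minimal sketch of RF-kNN-style imputation, assuming hypothetical predictor and response arrays: a random forest trained on remote-sensing predictors defines a proximity between map pixels and reference field plots (the fraction of trees in which two observations share a terminal node), and each pixel is imputed from its k most proximate plots. This is an illustrative reconstruction, not the authors' exact pipeline.

```python
# Illustrative RF-kNN imputation sketch (not the authors' exact workflow).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_knn_impute(X_ref, y_ref, X_target, k=5, n_trees=500, seed=42):
    """X_ref: predictors at field plots; y_ref: measured attribute (e.g. biomass);
    X_target: predictors at map pixels. All inputs are hypothetical arrays."""
    y_ref = np.asarray(y_ref)
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed)
    rf.fit(X_ref, y_ref)
    leaves_ref = rf.apply(X_ref)       # (n_ref, n_trees) terminal-node ids
    leaves_tgt = rf.apply(X_target)    # (n_target, n_trees)
    imputed = np.empty(len(X_target))
    for i, row in enumerate(leaves_tgt):
        proximity = (leaves_ref == row).mean(axis=1)  # share of shared leaves
        nearest = np.argsort(proximity)[-k:]          # k most similar plots
        imputed[i] = y_ref[nearest].mean()            # impute from neighbours
    return imputed
```

In the study's terms, the reference rows would correspond to FIA or variable-plot field plots and the target rows to the pixels of the optical or LiDAR predictor stack.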

Relevance:

10.00%

Publisher:

Abstract:

Forest trees, like oaks, rely on high levels of genetic variation to adapt to varying environmental conditions. Thus, genetic variation and its distribution are important for the long-term survival and adaptability of oak populations. Climate change is projected to lead to increased drought and fire events as well as a northward migration of tree species, including oaks. Additionally, decline in oak regeneration has become increasingly concerning since it may lead to decreased gene flow and increased inbreeding levels. This will in turn lead to lowered levels of genetic diversity, negatively affecting the growth and survival of populations. At the same time, populations at the species’ distribution edge, like those in this study, could possess important stores of genetic diversity and adaptive potential, while also being vulnerable to climatic or anthropogenic changes. A survey of the level and distribution of genetic variation and identification of potentially adaptive genes is needed since adaptive genetic variation is essential for their long-term survival. Oaks possess a remarkable characteristic in that they maintain their species identity and specific environmental adaptations despite their propensity to hybridize. Thus, in the face of interspecific gene flow, some areas of the genome remain differentiated due to selection. This characteristic allows the study of local environmental adaptation through genetic variation analyses. Furthermore, using genic markers with known putative functions makes it possible to link those differentiated markers to potential adaptive traits (e.g., flowering time, drought stress tolerance). Demographic processes like gene flow and genetic drift also play an important role in how genes (including adaptive genes) are maintained or spread. These processes are influenced by disturbances, both natural and anthropogenic. An examination of how genetic variation is geographically distributed can display how these genetic processes and geographical disturbances influence genetic variation patterns. For example, the spatial clustering of closely related trees could promote inbreeding with associated negative effects (inbreeding depression), if gene flow is limited. In turn this can have negative consequences for a species’ ability to adapt to changing environmental conditions. In contrast, interspecific hybridization may also allow the transfer of genes between species that increase their adaptive potential in a changing environment. I have studied the ecologically divergent, interfertile red oaks, Quercus rubra and Q. ellipsoidalis, to identify genes with potential roles in adaptation to abiotic stress through traits such as drought tolerance and flowering time, and to assess the level and distribution of genetic variation. I found evidence for moderate gene flow between the two species and low interspecific genetic differences at most genetic markers (Lind and Gailing 2013). However, the screening of genic markers with potential roles in phenology and drought tolerance led to the identification of a CONSTANS-like (COL) gene, a candidate gene for flowering time and growth. This marker, located in the coding region of the gene, was highly differentiated between the two species in multiple geographical areas, despite interspecific gene flow, and may play a role in reproductive isolation and adaptive divergence between the two species (Lind-Riehl et al. 2014). 
Since climate change could result in a northward migration of tree species such as oaks, this gene could be important in maintaining species identity despite increased contact zones between species (e.g., increased gene flow). Finally, I examined differences in spatial genetic structure (SGS) and genetic variation between species and populations subjected to different management strategies and natural disturbances. Diverse management activities, combined with various natural disturbances and species-specific life history traits, influenced SGS patterns and inbreeding levels (Lind-Riehl and Gailing, submitted).

Relevance:

10.00%

Publisher:

Abstract:

Lake sturgeon (Acipenser fulvescens) were historically abundant in the Huron-Erie Corridor (HEC), a 160 km river/channel network composed of the St. Clair River, Lake St. Clair, and the Detroit River that connects Lake Huron to Lake Erie. In the HEC, most natural lake sturgeon spawning substrates have been eliminated or degraded as a result of channelization and dredging. To address significant habitat loss in the HEC, multi-agency restoration efforts are underway to restore spawning substrate by constructing artificial spawning reefs. The main objective of this study was to conduct post-construction monitoring of lake sturgeon egg deposition and larval emergence near two of these artificial reef projects: Fighting Island Reef in the Detroit River and the Middle Channel Spawning Reef in the lower St. Clair River. We also investigated seasonal and nightly timing of larval emergence, growth, and vertical distribution in the water column at these sites, and at an additional site in the St. Clair River where lake sturgeon are known to spawn on a bed of ~100-year-old coal clinkers. From 2010 to 2012, we collected viable eggs and larvae at all three sites, indicating that these artificial reefs are creating conditions suitable for egg deposition, fertilization, incubation, and larval emergence. The construction methods and materials, and the physical site conditions, present in the HEC artificial reef projects can be used to inform future spawning habitat restoration or enhancement efforts. The results from this study have also identified the likelihood of additional uncharacterized natural spawning sites in the St. Clair River. In addition to the field study, we conducted a laboratory experiment involving the actual substrate materials that have been used in artificial reef construction in this system. Although coal clinkers are chemically inert, some trace elements can be reincorporated into the clinker material during the combustion process. Since lake sturgeon eggs and larvae develop in close proximity to this material, it is important to measure the concentration of potentially toxic trace elements. This study focused on arsenic, which occurs naturally in coal and can be toxic to fishes. Total arsenic concentration was measured in samples taken from four substrate treatments submerged in distilled water: limestone cobble, rinsed limestone cobble, coal clinker, and rinsed coal clinker. Samples were taken at three time intervals: 24 hours, 11 days, and 21 days. ICP-MS analysis showed that concentrations of total arsenic were below the EPA drinking water standard (10 ppb) for all samples. However, at the 24-hour sampling interval, a two-way repeated-measures ANOVA with a Holm-Sidak post hoc analysis (α = 0.05) showed that the mean arsenic concentration was significantly higher in the coal clinker substrate treatment than in the rinsed coal clinker treatment (p = 0.006), the limestone cobble treatment (p
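As an illustration of the post hoc comparison described for the 24-hour interval, the sketch below runs pairwise t-tests between substrate treatments and applies a Holm-Sidak correction; the arsenic concentrations are invented placeholder values, not the study's measurements.

```python
# Illustrative pairwise comparison of arsenic concentrations (ppb) at the
# 24-hour interval with Holm-Sidak correction. Values are placeholders only.
from itertools import combinations
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

treatments = {
    "coal_clinker":        [2.1, 2.4, 2.3],
    "rinsed_coal_clinker": [1.2, 1.1, 1.3],
    "limestone":           [0.9, 1.0, 1.1],
    "rinsed_limestone":    [0.8, 0.9, 0.9],
}

pairs, pvals = [], []
for a, b in combinations(treatments, 2):
    stat, p = ttest_ind(treatments[a], treatments[b])
    pairs.append((a, b))
    pvals.append(p)

reject, p_adj, _, _ = multipletests(pvals, alpha=0.05, method="holm-sidak")
for (a, b), p, r in zip(pairs, p_adj, reject):
    print(f"{a} vs {b}: adjusted p = {p:.3f}, significant = {r}")
```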

Relevance:

10.00%

Publisher:

Abstract:

When underground mines close, they often fill with water from ground and surface sources; each mine can contain millions to billions of gallons of water. This water, heated by the Earth's geothermal energy, reaches temperatures ideal for heat pumps. The sheer scale of these flooded underground mines presents a unique opportunity for large-scale geothermal heat pump installations that would not be as economically, socially, and environmentally feasible anywhere else. A literature search revealed approximately 30 instances of flooded underground mines being used to heat and cool buildings worldwide. With thousands of closed or abandoned underground mines in the U.S. and an estimated one million globally, why hasn't this opportunity been more widely adopted? This project found negative perceptions and a lack of knowledge about feasibility to be key barriers. To address these issues, the project drafted a guidebook for former mining communities titled A Community Guide to Mine Water Geothermal Heating and Cooling.

Relevance:

10.00%

Publisher:

Abstract:

A low-cost electrophoretic deposition (EPD) process was successfully used for liquid metal thin-film deposition at a high deposition rate of 0.6 µm/min. Furthermore, silicon nano-powder and liquid metal were simultaneously deposited as the negative electrode of a lithium-ion battery using a technique called co-EPD. The liquid metal was expected to act as a matrix for the silicon particles during lithium-ion insertion and extraction. Half-cell testing was performed using the as-prepared co-EPD samples. An initial discharge capacity of 1500 mAh/g was reported for the nano-silicon/galinstan electrode, although capacity fading was also observed in these samples.
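To make the quoted 1500 mAh/g concrete, the following sketch computes gravimetric capacity from a constant-current half-cell discharge; the current, duration, and active-material mass are hypothetical values chosen only to illustrate the unit, not measurements from this work.

```python
# Gravimetric discharge capacity from a constant-current half-cell test.
# Current, duration, and active mass below are hypothetical illustration values.
current_mA = 0.15        # discharge current
duration_h = 10.0        # time to reach the cutoff voltage
active_mass_g = 0.001    # mass of active material (1 mg)

capacity_mAh_per_g = current_mA * duration_h / active_mass_g
print(f"Specific capacity: {capacity_mAh_per_g:.0f} mAh/g")  # 1500 mAh/g
```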

Relevance:

10.00%

Publisher:

Abstract:

Traditionally, densities of newly built roadways are checked by direct sampling (cores) or by nuclear density gauge measurements. For roadway engineers, the density of asphalt pavement surfaces is essential for determining pavement quality. Unfortunately, field measurements of density by direct sampling or by nuclear measurement are slow processes. Therefore, I have explored the use of rapidly deployed ground penetrating radar (GPR) as an alternative means of determining pavement quality. The dielectric constant of the pavement surface correlates with pavement density and can be used as a proxy when the density of the asphalt is not known from nuclear or destructive methods, and it can be determined using GPR. In order to use GPR for evaluation of road surface quality, the relationship between the dielectric constants of asphalt and their densities must be established. Field measurements of GPR were taken at four highway sites in Houghton and Keweenaw Counties, Michigan, where density values were also obtained using nuclear methods in the field. Laboratory studies involved asphalt samples taken from the field sites and samples created in the laboratory. These were tested in various ways, including density, thickness, and time domain reflectometry (TDR). In the field, GPR data were acquired using a 1000 MHz air-launched unit and a ground-coupled unit at 200 and 500 MHz. The equipment was owned and operated by the Michigan Department of Transportation (MDOT) and was available for this study for a total of four days during summer 2005 and spring 2006. The analysis of the reflected waveforms included “routine” processing for velocity using commercial software and direct evaluation of reflection coefficients to determine a dielectric constant. The dielectric constants computed from velocities do not agree well with those obtained from reflection coefficients. Perhaps due to the limited range of asphalt types studied, no correlation between density and dielectric constant was evident. Laboratory measurements were taken with samples removed from the field and samples created for this study. Samples from the field were studied using TDR in order to obtain the dielectric constant directly, and these values correlated well with the estimates made from reflection coefficients. Samples created in the laboratory were measured using 1000 MHz air-launched GPR and 400 MHz ground-coupled GPR, each under both wet and dry conditions. On the basis of these observations, I conclude that the dielectric constant of asphalt can be reliably measured from waveform amplitude analysis of GPR data, based on the consistent agreement with values obtained in the laboratory using TDR. Because of the uniformity of the asphalts studied here, any correlation between dielectric constant and density is not yet apparent.
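A minimal sketch of the two routes to a dielectric constant described above: from the surface reflection amplitude (here normalized by a metal-plate reference, a common convention assumed for illustration) and from the two-way travel time through a layer of known thickness. The numeric values are placeholders, not measurements from this study.

```python
# Two common routes to a pavement dielectric constant from GPR data.
# All numeric values are placeholders for illustration.
C = 0.2998  # speed of light in vacuum, m/ns

def eps_from_reflection(a_surface, a_metal_plate):
    """Dielectric constant from the surface reflection amplitude, normalized by
    a metal-plate (perfect reflector) shot: eps = ((1 + R) / (1 - R))^2."""
    r = abs(a_surface / a_metal_plate)
    return ((1 + r) / (1 - r)) ** 2

def eps_from_velocity(thickness_m, two_way_time_ns):
    """Dielectric constant from layer velocity: v = 2h / t, eps = (C / v)^2."""
    v = 2 * thickness_m / two_way_time_ns
    return (C / v) ** 2

print(eps_from_reflection(0.35, 1.0))   # ~4.3 for a relative amplitude of 0.35
print(eps_from_velocity(0.05, 0.72))    # 5 cm lift, 0.72 ns two-way travel time
```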

Relevance:

10.00%

Publisher:

Abstract:

This thesis examines the relationship between oil prices and economic activity, and it attempts to address the question: do increases in oil prices (oil shocks) precede U.S. recessions? The analysis applies a macroeconomic point of view together with a combination of mathematical and statistical models. Two such models are used to determine the ability of oil prices to predict recessions in the United States. First, using the binary cyclical indicator procedure (the Bry-Boschan method) to compare turning points in oil prices with turning points in GDP shows that oil prices almost always turn five months before a recession, suggesting that an oil shock might occur before a recession. Second, the Granger causality test shows that changes in oil prices do Granger-cause U.S. recessions, indicating that oil prices are a useful signal of an impending U.S. recession. Finally, combining this analysis with the literature, there are several potential explanations for why spikes in oil prices result in slower GDP growth and are a contributing factor to U.S. recessions.
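As a sketch of the second test described above, the snippet below runs statsmodels' Granger-causality test of oil-price changes on a recession indicator; the file name, column names, and lag choice are illustrative assumptions, not details from the thesis.

```python
# Hedged sketch: does the oil-price change Granger-cause the recession indicator?
# The CSV path, column names, and maximum lag are illustrative assumptions.
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("oil_and_recessions.csv", parse_dates=["date"], index_col="date")
df["oil_change"] = df["oil_price"].pct_change()
df = df.dropna()

# Convention: tests whether the SECOND column Granger-causes the FIRST column.
grangercausalitytests(df[["recession", "oil_change"]], maxlag=12)
```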

Relevance:

10.00%

Publisher:

Abstract:

In this thesis we study weak isometries of Hamming spaces. These are permutations of a Hamming space that preserve some, but not necessarily all, distances. We wish to find conditions under which a weak isometry is in fact an isometry. This type of problem was first posed by Beckman and Quarles for R^n. In Chapter 2 we give definitions pertinent to our research. The third chapter focuses on some known results in this area, with special emphasis on papers by V. Krasin as well as S. De Winter and M. Korb, who solved this problem for the Boolean cube, that is, the binary Hamming space. We attempted to generalize some of their methods to the non-Boolean case. The fourth chapter contains our new results and is split into two major contributions. Our first contribution shows that if p = n or p < n/2, then every weak isometry of H_q^n that preserves distance p is an isometry. Our second contribution gives a possible method to check whether a weak isometry is an isometry using linear algebra and graph theory.
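To make the notion of a weak isometry concrete, here is a small brute-force sketch over a tiny Hamming space H_q^n: it checks whether a given permutation of the words preserves a single distance p, and separately whether it preserves all distances (i.e., is an isometry). The example permutation is arbitrary and for illustration only; it is not drawn from the thesis.

```python
# Brute-force illustration on a small Hamming space H_q^n: a weak isometry
# preserves some fixed distance p; an isometry preserves all distances.
from itertools import product

q, n, p = 2, 4, 2
words = list(product(range(q), repeat=n))

def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def preserves_distance(perm, dist):
    """d(u, v) == dist implies d(perm[u], perm[v]) == dist (weak-isometry condition)."""
    return all(hamming(perm[u], perm[v]) == dist
               for u in words for v in words if hamming(u, v) == dist)

def is_isometry(perm):
    return all(hamming(u, v) == hamming(perm[u], perm[v])
               for u in words for v in words)

# Example permutation: flip every coordinate (an isometry of the Boolean cube).
perm = {w: tuple(1 - x for x in w) for w in words}
print(preserves_distance(perm, p), is_isometry(perm))  # True True
```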

Relevance:

10.00%

Publisher:

Abstract:

By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as “networks on wheels”, can greatly enhance traffic safety, traffic efficiency and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, impose critical challenges for the efficient and reliable implementation of VANETs. This dissertation is motivated by the great application potential of VANETs for efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination and data collection, this dissertation research targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four aspects: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to reliably notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications to exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative-position-based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone-of-relevance under varying traffic density. Given the large volume of vehicular sensor data available in VANETs, the scheme of compressive-sampling-based data collection (CS-DC) is proposed to efficiently collect spatially relevant data at a large scale, especially in dense traffic. In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit their commercial potential, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD) and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination and data collection, together with the development of novel promising applications, this dissertation will help push VANETs further toward the stage of massive deployment.

Relevance:

10.00%

Publisher:

Abstract:

It is remarkable that there are no deployed military hybrid vehicles, since battlefield fuel costs approximately 100 times as much as civilian fuel. In the commercial marketplace, where fuel prices are much lower, electric hybrid vehicles have become increasingly common due to their increased fuel efficiency and the associated operating cost benefit. The absence of military hybrid vehicles is not due to a lack of investment in research and development, but rather because applying hybrid vehicle architectures to a military application has unique challenges. These challenges include inconsistent duty cycles for propulsion requirements and the absence of methods to look at vehicle energy in a holistic sense. This dissertation provides a remedy to these challenges by presenting a method to quantify the benefits of a military hybrid vehicle by regarding that vehicle as a microgrid. This innovative concept allowed for the creation of an expandable, multiple-input numerical optimization method that was implemented for both real-time control and system design optimization. An example of each of these implementations was presented. Optimization-in-the-loop using this new method was compared to a traditional closed-loop control system and proved to be more fuel efficient. System design optimization using this method successfully illustrated battery size optimization by iterating through various electric duty cycles. By utilizing this new multiple-input numerical optimization method, a holistic view of duty cycle synthesis, vehicle energy use, and vehicle design optimization can be achieved.

Relevance:

10.00%

Publisher:

Abstract:

Aluminum alloyed with small atomic fractions of Sc, Zr, and Hf has been shown to exhibit high-temperature microstructural stability that may improve high-temperature mechanical behavior. These quaternary alloys were designed using thermodynamic modeling to increase the volume fraction of precipitated tri-aluminide phases and thereby improve thermal stability. When aged during a multi-step, isochronal heat treatment, two compositions showed a secondary room-temperature hardness peak of up to 700 MPa at 450 °C. Elevated-temperature hardness profiles also indicated an increase in hardness from 200 to 300 °C, attributed to the precipitation of Al3Sc; however, no secondary hardness response was observed from the Al3Zr or Al3Hf phases in this alloy.

Relevance:

10.00%

Publisher:

Abstract:

I utilized state-of-the-art remote sensing and GIS (Geographical Information System) techniques to study large-scale biological, physical and ecological processes of the coastal, nearshore, and offshore waters of Lake Michigan and Lake Superior. These processes ranged from chlorophyll a and primary production time series analyses in Lake Michigan to coastal stamp sand threats to Buffalo Reef in Lake Superior. I used SeaWiFS (Sea-viewing Wide Field-of-view Sensor) satellite imagery to trace various biological, chemical and optical water properties of Lake Michigan during the past decade and to investigate the collapse of early spring primary production. Using spatial analysis techniques, I was able to connect these changes to important biological processes in the lake (quagga mussel filtration). In a separate study on Lake Superior, using LiDAR (Light Detection and Ranging) and aerial photos, we examined natural coastal erosion in Grand Traverse Bay, Michigan, and discussed a variety of geological features that influence general sediment accumulation patterns and interactions with migrating tailings from legacy mining. These sediments are moving southwesterly towards Buffalo Reef, creating a threat to the lake trout and lake whitefish breeding ground.