9 results for computation- and data-intensive applications
in Digital Commons - Michigan Tech
Abstract:
The demands in production and the associated costs of power generation from non-renewable resources are increasing at an alarming rate. Solar energy is one of the renewable resources with the potential to mitigate this increase. Utilization of solar energy has been concentrated mainly on heating applications. The use of solar energy in building cooling systems would contribute greatly to the goal of minimizing non-renewable energy use. The approaches of solar energy heating system research conducted by institutions such as the University of Wisconsin at Madison, and the building heat flow modeling research conducted by Oklahoma State University, can be used to develop and optimize a solar cooling building system. This research uses two approaches to develop Graphical User Interface (GUI) software for an integrated solar absorption cooling building model, which is capable of simulating and optimizing an absorption cooling system that uses solar energy as the main energy source to drive the cycle. The software was then put through a number of litmus tests to verify its integrity. The litmus tests were conducted on building cooling system data sets from similar applications around the world. The output obtained from the developed software was identical to established experimental results from the data sets used. Software developed by other research efforts caters to advanced users. The software developed in this research is reliable not only in its code integrity but also in its integrated approach, which caters to new entry-level users. Hence, this dissertation aims to correctly model a complete building with an absorption cooling system in an appropriate climate as a cost-effective alternative to a conventional vapor compression system.
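As a rough illustration of the kind of energy balance underlying an absorption cooling simulation (not the dissertation's actual model), the sketch below computes a single-effect absorption chiller's coefficient of performance from assumed heat flows; all values and names are illustrative:

```python
# Minimal sketch of a single-effect absorption-chiller energy balance.
# All values are illustrative assumptions, not results from the dissertation's model.

def absorption_cop(q_evaporator_kw, q_generator_kw, w_pump_kw=0.0):
    """Coefficient of performance: cooling delivered per unit driving heat (plus pump work)."""
    return q_evaporator_kw / (q_generator_kw + w_pump_kw)

# Example: 10 kW of cooling driven by 14 kW of solar-heated generator input.
cop = absorption_cop(q_evaporator_kw=10.0, q_generator_kw=14.0, w_pump_kw=0.2)
print(f"COP = {cop:.2f}")  # roughly 0.7, typical of single-effect LiBr-water chillers
```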
Abstract:
The primary challenge in groundwater and contaminant transport modeling is obtaining the data needed for constructing, calibrating, and testing the models. Large amounts of data are necessary for describing the hydrostratigraphy in areas with complex geology. Increasingly, states are making spatial data available that can be used as input to groundwater flow models. The appropriateness of these data for large-scale flow systems has not been tested. This study focuses on modeling a plume of 1,4-dioxane in a heterogeneous aquifer system in Scio Township, Washtenaw County, Michigan. The analysis consisted of: (1) characterization of the hydrogeology of the area and construction of a conceptual model based on publicly available spatial data, (2) development and calibration of a regional flow model for the site, (3) conversion of the regional model to a more highly resolved local model, (4) simulation of the dioxane plume, and (5) evaluation of the model's ability to simulate field data and estimation of the possible dioxane sources and their subsequent migration until maximum concentrations are at or below the Michigan Department of Environmental Quality's residential cleanup standard for groundwater (85 ppb). The MODFLOW-2000 and MT3D programs were utilized to simulate the groundwater flow and the development and movement of the 1,4-dioxane plume, respectively. MODFLOW simulates transient groundwater flow in a quasi-3-dimensional sense, subject to a variety of boundary conditions that can simulate recharge, pumping, and surface-water/groundwater interactions. MT3D simulates solute advection with groundwater flow (using the flow solution from MODFLOW), dispersion, source/sink mixing, and chemical reaction of contaminants. This modeling approach was successful at simulating the groundwater flows by calibrating recharge and hydraulic conductivities. The plume transport was adequately simulated using literature dispersivity and sorption coefficients, although the plume geometries were not well constrained.
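For intuition about what MT3D solves on top of the MODFLOW flow solution, here is a minimal one-dimensional advection-dispersion sketch with linear sorption (retardation); the grid, velocity, dispersivity, and retardation factor are illustrative assumptions, not values from the Scio Township model:

```python
import numpy as np

# 1-D advection-dispersion with linear sorption (retardation), explicit finite differences.
# All parameter values are illustrative, not calibrated values from this study.
nx, dx = 200, 5.0             # number of cells, cell size (m)
v = 0.1                       # pore-water velocity (m/day)
alpha_l = 10.0                # longitudinal dispersivity (m)
D = alpha_l * v               # mechanical dispersion coefficient (m^2/day)
R = 2.0                       # retardation factor from linear sorption
dt = 0.5 * min(dx / v, dx**2 / (2 * D)) / R   # conservative explicit stability limit

c = np.zeros(nx)
c[0] = 85.0                   # constant-concentration source cell (ppb)

steps = 4000
for _ in range(steps):
    # upwind advection + central-difference dispersion, both slowed by retardation
    adv = -v * (c[1:-1] - c[:-2]) / dx
    disp = D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
    c[1:-1] += dt * (adv + disp) / R
    c[0] = 85.0               # hold the source concentration
    c[-1] = c[-2]             # simple outflow boundary

front_m = np.argmax(c < 1.0) * dx
print(f"plume front (>1 ppb) extends about {front_m:.0f} m after {steps * dt:.0f} days")
```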
Abstract:
Sustainable yields from water wells in hard-rock aquifers are achieved when the well bore intersects fracture networks. Fracture networks are often not readily discernible at the surface. Lineament analysis using remotely sensed satellite imagery has been employed to identify surface expressions of fracturing, and a variety of image-analysis techniques have been successfully applied in "ideal" settings. An ideal setting for lineament detection is one where the influences of human development, vegetation, and climatic conditions are minimal and the hydrogeological conditions and geologic structure are known. There is not yet a well-accepted protocol for mapping lineaments, nor have different approaches been compared in non-ideal settings. A new image-processing and synthesis approach was developed to identify which satellite imagery types are successful for lineament analysis in non-ideal terrain. Four satellite sensors (ASTER, Landsat7 ETM+, QuickBird, RADARSAT-1) and a digital elevation model were evaluated for lineament analysis in Boaco, Nicaragua, where the landscape is subject to varied vegetative cover, a plethora of anthropogenic features, and frequent cloud cover that limits the availability of optical satellite data. A variety of digital image processing techniques were employed and lineament interpretations were performed to obtain 12 complementary image products, which were evaluated subjectively to identify lineaments. The 12 lineament interpretations were synthesized to create a raster image of lineament zone coincidence that shows the level of agreement among the 12 interpretations. A composite lineament interpretation was made using the coincidence raster to restrict lineament observations to areas where multiple interpretations (at least 4) agree. Nine of the 11 previously mapped faults were identified from the coincidence raster. An additional 26 lineaments were identified from the coincidence raster, and the locations of 10 were confirmed by field observation. Four manual pumping tests suggest that well productivity is higher for wells proximal to lineament features. Interpretations from RADARSAT-1 products were superior to interpretations from other sensor products, suggesting that quality lineament interpretation in this region requires anthropogenic features to be minimized and topographic expressions to be maximized. The approach developed in this study has the potential to improve the siting of wells in non-ideal regions.
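A minimal sketch of the coincidence-raster step, assuming each of the 12 interpretations has been rasterized to a binary grid on a common extent (the agreement threshold of 4 mirrors the description above; everything else is illustrative):

```python
import numpy as np

# Each interpretation: a binary raster (1 = lineament zone present) on a shared grid.
# Twelve random layers stand in here for the rasterized manual interpretations.
rng = np.random.default_rng(0)
interpretations = rng.integers(0, 2, size=(12, 500, 500), dtype=np.uint8)

# Coincidence raster: per-cell count of interpretations that agree a lineament is present.
coincidence = interpretations.sum(axis=0)

# Composite interpretation: keep only cells where at least 4 of the 12 interpretations agree.
composite = coincidence >= 4

print(f"cells with >=4 agreeing interpretations: {composite.sum()} of {composite.size}")
```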
Abstract:
By providing vehicle-to-vehicle and vehicle-to-infrastructure wireless communications, vehicular ad hoc networks (VANETs), also known as "networks on wheels", can greatly enhance traffic safety, traffic efficiency, and the driving experience for intelligent transportation systems (ITS). However, the unique features of VANETs, such as high mobility and uneven distribution of vehicular nodes, pose critical efficiency and reliability challenges for the implementation of VANETs. This dissertation is motivated by the great application potential of VANETs in the design of efficient in-network data processing and dissemination. Considering the significance of message aggregation, data dissemination, and data collection, this dissertation research targets enhancing traffic safety and traffic efficiency, as well as developing novel commercial applications based on VANETs, along four aspects: 1) accurate and efficient message aggregation to detect on-road safety-relevant events, 2) reliable data dissemination to reliably notify remote vehicles, 3) efficient and reliable spatial data collection from vehicular sensors, and 4) novel promising applications to exploit the commercial potential of VANETs. Specifically, to enable cooperative detection of safety-relevant events on the roads, the structure-less message aggregation (SLMA) scheme is proposed to improve communication efficiency and message accuracy. The scheme of relative position based message dissemination (RPB-MD) is proposed to reliably and efficiently disseminate messages to all intended vehicles in the zone-of-relevance under varying traffic density. Given the large volume of vehicular sensor data available through VANETs, the scheme of compressive sampling based data collection (CS-DC) is proposed to efficiently collect spatially relevant data at a large scale, especially in dense traffic. In addition, with novel and efficient solutions proposed for the application-specific issues of data dissemination and data collection, several appealing value-added applications for VANETs are developed to exploit the commercial potential of VANETs, namely general purpose automatic survey (GPAS), VANET-based ambient ad dissemination (VAAD), and VANET-based vehicle performance monitoring and analysis (VehicleView). Thus, by improving the efficiency and reliability of in-network data processing and dissemination, including message aggregation, data dissemination, and data collection, together with the development of novel promising applications, this dissertation will help push VANETs further toward the stage of massive deployment.
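The CS-DC scheme itself is not reproduced here, but the compressive-sampling idea it builds on can be sketched: recover a sparse signal from far fewer random measurements than samples. The toy reconstruction below uses generic orthogonal matching pursuit with illustrative dimensions; it is not the dissertation's algorithm:

```python
import numpy as np

# Toy compressive sampling: recover a sparse signal from m << n random measurements
# via orthogonal matching pursuit (OMP). Dimensions and sparsity are illustrative.
rng = np.random.default_rng(1)
n, m, k = 256, 64, 8                      # signal length, measurements, sparsity

x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

Phi = rng.standard_normal((m, n)) / np.sqrt(m)   # random measurement matrix
y = Phi @ x_true                                 # compressed measurements

def omp(Phi, y, k):
    """Greedy sparse recovery: pick the most correlated column, refit, repeat k times."""
    residual, support = y.copy(), []
    for _ in range(k):
        j = int(np.argmax(np.abs(Phi.T @ residual)))   # most correlated column
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x_hat = np.zeros(Phi.shape[1])
    x_hat[support] = coef
    return x_hat

x_hat = omp(Phi, y, k)
print(f"relative recovery error: {np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true):.2e}")
```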
Abstract:
To analyze the characteristics and predict the dynamic behaviors of complex systems over time, comprehensive research is crucially needed to enable the development of systems that can intelligently adapt to evolving conditions and infer new knowledge with algorithms that are not predesigned. This dissertation research studies the integration of techniques and methodologies resulting from the fields of pattern recognition, intelligent agents, artificial immune systems, and distributed computing platforms, to create technologies that can more accurately describe and control the dynamics of real-world complex systems. The need for such technologies is emerging in manufacturing, transportation, hazard mitigation, weather and climate prediction, homeland security, and emergency response. Motivated by the ability of mobile agents to dynamically incorporate additional computational and control algorithms into executing applications, mobile agent technology is employed in this research for adaptive sensing and monitoring in a wireless sensor network. Mobile agents are software components that can travel from one computing platform to another in a network and carry the programs and data states needed for performing their assigned tasks. To support the generation, migration, communication, and management of mobile monitoring agents, an embeddable mobile agent system (Mobile-C) is integrated with sensor nodes. Mobile monitoring agents visit distributed sensor nodes, read real-time sensor data, and perform anomaly detection using the pattern recognition algorithms with which they are equipped. The optimal control of agents is achieved by mimicking the adaptive immune response and applying multi-objective optimization algorithms. The mobile agent approach has the potential to reduce the communication load and energy consumption in monitoring networks. The major research work of this dissertation project includes: (1) studying effective feature extraction methods for time series measurement data; (2) investigating the impact of the feature extraction methods and dissimilarity measures on the performance of pattern recognition; (3) researching the effects of environmental factors on the performance of pattern recognition; (4) integrating an embeddable mobile agent system with wireless sensor nodes; (5) optimizing agent generation and distribution using artificial immune system concepts and multi-objective algorithms; (6) applying mobile agent technology and pattern recognition algorithms to adaptive structural health monitoring and driving cycle pattern recognition; (7) developing a web-based monitoring network to enable the remote visualization and analysis of real-time sensor data. Techniques and algorithms developed in this dissertation project will contribute to research advances in networked distributed systems operating under changing environments.
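As a minimal, generic illustration of the windowed feature extraction and dissimilarity-based anomaly detection a monitoring agent might perform (the specific features, dissimilarity measures, and algorithms studied in the dissertation are not reproduced here):

```python
import numpy as np

# Sketch: window a sensor time series, extract simple statistical features, and flag
# windows whose normalized distance from the baseline feature statistics is large.
# The feature choices and the 3-sigma threshold are illustrative assumptions.
def features(window):
    return np.array([window.mean(), window.std(), window.max() - window.min(),
                     np.abs(np.diff(window)).mean()])

def windows(signal, size=100, step=50):
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]

rng = np.random.default_rng(2)
baseline = rng.standard_normal(5000)                 # healthy-condition sensor data
test = rng.standard_normal(2000)
test[1200:1300] += 4.0                               # injected anomaly

base_feats = np.array([features(w) for w in windows(baseline)])
mu, sigma = base_feats.mean(axis=0), base_feats.std(axis=0) + 1e-9

for i, w in enumerate(windows(test)):
    dist = np.linalg.norm((features(w) - mu) / sigma)   # normalized dissimilarity
    if dist > 3.0:
        print(f"window {i}: possible anomaly (distance {dist:.1f})")
```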
Abstract:
Though 3D computer graphics has seen tremendous advancement in the past two decades, most available mechanisms for computer interaction in 3D are high-cost and targeted at industry and virtual reality applications. Recent advances in Micro-Electro-Mechanical-System (MEMS) devices have brought forth a variety of new low-cost, low-power, miniature sensors with high accuracy, which are well suited for hand-held devices. In this work, a novel design for a 3D computer game controller using inertial sensors is proposed, and a prototype device based on this design is implemented. The design incorporates MEMS accelerometers and gyroscopes from Analog Devices to measure the three components of acceleration and angular velocity. From these sensor readings, the position and orientation of the hand-held compartment can be calculated using numerical methods. The implemented prototype utilizes a USB 2.0-compliant interface for power and communication with the host system. A Microchip dsPIC microcontroller is used in the design. This microcontroller integrates the analog-to-digital converters, the program flash memory, and the core processor on a single integrated circuit. A PC running the Microsoft Windows operating system is used as the host machine. Prototype firmware for the microcontroller is developed and tested to establish communication between the device and the host, and to perform data acquisition and initial filtering of the sensor data. A PC front-end application with a graphical interface is developed to communicate with the device and allow real-time visualization of the acquired data.
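A minimal sketch of the numerical integration involved, restricted to the planar case and omitting the calibration, gravity compensation, and drift filtering a real controller needs (the sample rate and inputs are assumed, not taken from the prototype):

```python
import numpy as np

# Sketch of inertial dead reckoning in the plane: integrate the gyroscope rate to get
# heading, rotate body-frame accelerations into the world frame, then integrate twice
# for position. Bias estimation and filtering are intentionally omitted.
dt = 0.01                                  # 100 Hz sample rate (assumed)
theta = 0.0                                # heading (rad)
vel = np.zeros(2)                          # world-frame velocity
pos = np.zeros(2)                          # world-frame position

def step(accel_body, gyro_z):
    """Advance the state by one sample of body-frame acceleration and yaw rate."""
    global theta, vel, pos
    theta += gyro_z * dt
    c, s = np.cos(theta), np.sin(theta)
    accel_world = np.array([c * accel_body[0] - s * accel_body[1],
                            s * accel_body[0] + c * accel_body[1]])
    vel += accel_world * dt
    pos += vel * dt

# Example: constant yaw rate with a small forward acceleration for one second.
for _ in range(100):
    step(accel_body=np.array([0.5, 0.0]), gyro_z=0.2)
print(f"estimated position: {pos}, heading: {theta:.2f} rad")
```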
Processing and characterization of PbSnTe-based thermoelectric materials made by mechanical alloying
Abstract:
The research reported in this dissertation investigates the processes required to mechanically alloy Pb1-xSnxTe and AgSbTe2 and a method of combining these two end compounds to produce (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) thermoelectric materials for power generation applications. In general, traditional melt processing of these alloys has employed high-purity materials subjected to time- and energy-intensive processes, resulting in highly functional material that is not easily reproducible. This research reports the development of mechanical alloying processes using commercially available 99.9% pure elemental powders in order to provide a basis for the economical production of highly functional thermoelectric materials. Though there have been reports of high- and low-ZT materials fabricated by both melt alloying and mechanical alloying, the processing-structure-properties-performance relationship connecting how the material is made to its resulting functionality is poorly understood. This is particularly true for mechanically alloyed material, motivating an effort to investigate bulk material within the (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) system using the mechanical alloying method. This research adds to the body of knowledge concerning the way in which mechanical alloying can be used to efficiently produce high-ZT thermoelectric materials. The processes required to mechanically alloy elemental powders to form Pb1-xSnxTe and AgSbTe2, and to subsequently consolidate the alloyed powder, are described. The composition, the phases present in the alloy, and the volume percent, size, and spacing of the phases are reported. The room-temperature electronic transport properties of electrical conductivity, carrier concentration, and carrier mobility are reported for each alloy, and the effect of any secondary phase on the electronic transport properties is described. A mechanical mixing approach for incorporating the end compounds to produce (y)(AgSbTe2)–(1-y)(Pb1-xSnxTe) is described; when 5 vol.% AgSbTe2 was incorporated, it was found to form a solid solution with the Pb1-xSnxTe phase. An initial attempt to change the carrier concentration of the Pb1-xSnxTe phase was made by adding excess Te, and it was found that the carrier density of the alloys in this work is not sensitive to excess Te. It has been demonstrated, using the processing techniques reported in this research, that this material system, when appropriately doped, has the potential to perform as a highly functional thermoelectric material.
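For reference, the dimensionless figure of merit ZT mentioned above has the standard textbook definition below (standard notation, not taken from the dissertation):

```latex
ZT = \frac{S^{2}\,\sigma\,T}{\kappa},
\qquad
\sigma = n\,e\,\mu
```

where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, κ the thermal conductivity, n the carrier concentration, e the elementary charge, and μ the carrier mobility. The second relation connects ZT to the room-temperature transport properties reported in this work (electrical conductivity, carrier concentration, and carrier mobility).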
Abstract:
Current copper-based circuit technology is becoming a limiting factor in high-speed data transfer applications, as processors are improving at a faster rate than developments to increase on-board data transfer. One solution is to utilize optical waveguide technology to overcome these bandwidth and loss restrictions. The use of this technology virtually eliminates the heat and cross-talk loss seen in copper circuitry, while also operating at a higher bandwidth. Transitioning current fabrication techniques from small-scale laboratory environments to large-scale manufacturing presents significant challenges. Optical-to-electrical connections and out-of-plane coupling are significant hurdles in the advancement of optical interconnects. The main goals of this research are the development of direct-write material deposition and patterning tools for the fabrication of waveguide systems on large substrates, and the development of out-of-plane coupler components compatible with standard fiber optic cabling. Combining these elements with standard printed circuit boards allows for the fabrication of fully functional optical-electrical printed wiring boards (OEPWBs). A direct dispense tool was designed, assembled, and characterized for the repeatable dispensing of blanket waveguide layers over a range of thicknesses (25-225 µm), eliminating waste material and affording the ability to utilize large substrates. This tool was used to directly dispense multimode waveguide cores which required no UV definition or development. These cores had circular cross sections and were comparable in optical performance to lithographically fabricated square waveguides. Laser direct writing is a non-contact process that allows for the dynamic UV patterning of waveguide material on large substrates, eliminating the need for high-resolution masks. A laser direct write tool was designed, assembled, and characterized for direct-write patterning of waveguides that were comparable in quality to those produced using standard lithographic practices (0.047 dB/cm loss for laser-written waveguides compared to 0.043 dB/cm for lithographic waveguides). Straight waveguides and waveguide turns were patterned at multimode and single-mode sizes, and the process was characterized and documented. Support structures such as angled reflectors and vertical posts were produced, showing the versatility of the laser direct write tool. Commercially available components were implanted into the optical layer for out-of-plane routing of the optical signals. These devices featured spherical lenses on the input and output sides of a total internal reflection (TIR) mirror, as well as alignment pins compatible with the standard MT design. Fully functional OEPWBs were fabricated featuring input and output out-of-plane optical signal routing, with total optical losses not exceeding 10 dB. These prototypes survived thermal cycling (-40°C to 85°C) and humidity exposure (95±4% humidity), showing minimal degradation in optical performance. Operational failure occurred after environmental aging life testing at 110°C for 216 hours.
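As a rough back-of-the-envelope check of how the reported propagation losses relate to the 10 dB total-loss figure, the sketch below assumes an illustrative link length and per-coupler loss (neither is a measured value from this work):

```python
# Rough optical link budget: propagation loss plus input/output coupling losses.
# The 0.047 dB/cm figure is the laser-written waveguide loss reported above; the
# coupling loss and link length are illustrative assumptions.
propagation_loss_db_per_cm = 0.047
link_length_cm = 30.0
coupling_loss_db = 2.0          # assumed loss per out-of-plane coupler

total_loss_db = propagation_loss_db_per_cm * link_length_cm + 2 * coupling_loss_db
fraction_transmitted = 10 ** (-total_loss_db / 10)

print(f"total loss: {total_loss_db:.2f} dB "
      f"({fraction_transmitted:.1%} of input power); budget: 10 dB")
```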
Abstract:
The United States of America is making great efforts to transform its renewable and abundant biomass resources into cost-competitive, high-performance biofuels, bioproducts, and biopower. This is key to increasing domestic production of transportation fuels and renewable energy, and to reducing greenhouse gas and other pollutant emissions. This dissertation focuses specifically on assessing the life cycle environmental impacts of biofuels and bioenergy produced from renewable feedstocks, such as lignocellulosic biomass and renewable oils and fats. The first part of the dissertation presents the life cycle greenhouse gas (GHG) emissions and energy demands of renewable diesel (RD) and hydroprocessed jet fuels (HRJ). The feedstocks include soybean, camelina, field pennycress, jatropha, algae, and tallow. Results show that RD and HRJ produced from these feedstocks reduce GHG emissions by over 50% compared to comparably performing petroleum fuels. Fossil energy requirements are also significantly reduced. The second part of this dissertation discusses the life cycle GHG emissions, energy demands, and other environmental aspects of pyrolysis oil as well as pyrolysis oil derived biofuels and bioenergy. The feedstocks include waste materials, such as sawmill residues, logging residues, sugarcane bagasse, and corn stover, and short rotation forestry feedstocks, such as hybrid poplar and willow. These LCA results show that as much as 98% GHG emission savings is possible relative to a petroleum heavy fuel oil. Life cycle GHG savings of 77 to 99% were estimated for power generation from pyrolysis oil combustion relative to fossil fuel combustion for electricity, depending on the biomass feedstock and combustion technologies used. Transportation fuels hydroprocessed from pyrolysis oil show over 60% GHG reductions compared to petroleum gasoline and diesel. The energy required to produce pyrolysis oil and pyrolysis oil derived biofuels and bioelectricity comes mainly from renewable biomass, as opposed to fossil energy. Benefits are also seen in other environmental impact categories, including human health, ecosystem quality, and fossil resources. The third part of the dissertation addresses the direct land use change (dLUC) impact of forest-based biofuels and bioenergy. An intensive harvest of aspen in Michigan is investigated to understand the GHG mitigation achievable with biofuels and bioenergy production. The study shows that intensive harvest of aspen in Michigan, compared to business-as-usual (BAU) harvesting, can produce 18.5 billion gallons of ethanol to blend with gasoline for the transport sector over the next 250 years, or 32.2 billion gallons of bio-oil by the fast pyrolysis process, which can be combusted to generate electricity or upgraded to gasoline and diesel. Intensive harvesting of these forests results in an initial carbon loss in the aspen forest, but eventually accumulates more carbon in the ecosystem, which translates to a CO2 credit from the dLUC impact. The time required for the forest-based biofuels to reach carbon neutrality is approximately 60 years. The last part of the dissertation describes the use of a depolymerization model as a tool to understand the kinetic behavior of hemicellulose hydrolysis under dilute acid conditions. Experiments are carried out to measure the concentrations of xylose and xylooligomers during dilute acid hydrolysis of aspen, and the experimental data are used to fine-tune the parameters of the depolymerization model. The results show that the depolymerization model successfully predicts the xylose monomer profile in the reaction; however, it overestimates the concentrations of xylooligomers.
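For a sense of what such a kinetic model involves, here is a minimal lumped first-order sketch of dilute-acid hemicellulose hydrolysis (xylan to xylooligomers to xylose to degradation products) with made-up rate constants; the dissertation's depolymerization model tracks oligomer chain lengths explicitly and is tuned to the measured data:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lumped first-order kinetics for dilute-acid hemicellulose hydrolysis:
#   xylan --k1--> xylooligomers --k2--> xylose --k3--> degradation products
# Rate constants and initial composition are illustrative, not fitted values.
k1, k2, k3 = 0.05, 0.10, 0.01          # 1/min

def rates(t, y):
    xylan, oligo, xylose = y
    return [-k1 * xylan,
            k1 * xylan - k2 * oligo,
            k2 * oligo - k3 * xylose]

sol = solve_ivp(rates, (0, 120), [100.0, 0.0, 0.0], t_eval=np.linspace(0, 120, 7))
for t, oligo, xylose in zip(sol.t, sol.y[1], sol.y[2]):
    print(f"t = {t:5.1f} min   xylooligomers = {oligo:5.1f}   xylose = {xylose:5.1f}")
```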