9 results for non-ideal source

in Digital Commons - Michigan Tech


Relevance:

100.00%

Publisher:

Abstract:

Internal combustion engines are, and will continue to be, a primary mode of power generation for ground transportation. Challenges exist in meeting fuel consumption regulations and emission standards while upholding performance as fuel prices rise and resource depletion and environmental impacts become increasing concerns. Diesel engines are advantageous due to their inherent efficiency advantage over spark ignition engines; however, their NOx and soot emissions can be difficult to control and reduce due to an inherent tradeoff. Diesel combustion is spray- and mixing-controlled, providing an intrinsic link between spray and emissions and motivating detailed, fundamental studies of spray, vaporization, mixing, and combustion characteristics under engine-relevant conditions. An optical combustion vessel facility has been developed at Michigan Technological University for these studies, with detailed tests and analysis being conducted. In this combustion vessel facility a preburn procedure for thermodynamic state generation is used and validated using chemical kinetics modeling, both for the MTU vessel and for institutions comprising the Engine Combustion Network international collaborative research initiative. It is shown that the minor species produced are representative of modern diesel engines running exhaust gas recirculation and do not impact the autoignition of n-heptane. Diesel spray testing of a high-pressure (2000 bar) multi-hole injector is undertaken, including non-vaporizing, vaporizing, and combusting tests, with sprays characterized using Mie back-scatter imaging diagnostics. Liquid-phase spray parameter trends agree with the literature. Fluctuations in liquid length about a quasi-steady value are quantified, along with plume-to-plume variations. Hypotheses are developed for their causes, including fuel pressure fluctuations, nozzle cavitation, internal injector flow and geometry, chamber temperature gradients, and turbulence. These are explored using a mixing-limited vaporization model with an equation-of-state approach for thermophysical properties. This model is also applied to single- and multi-component surrogates. Results include the development of the combustion research facility and a validated thermodynamic state generation procedure. The developed equation-of-state approach can be applied to improving surrogate fuels, both single- and multi-component, in terms of diesel spray liquid length, with knowledge of only critical fuel properties. Experimental studies are coupled with modeling incorporating improved thermodynamic non-ideal gas and fuel
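For context, the core idea of a mixing-limited vaporization model is an energy balance: the liquid penetrates until the entrained ambient gas supplies just enough enthalpy to heat and vaporize the injected fuel. The sketch below is a minimal, simplified illustration of that balance, assuming constant specific heats and placeholder property values; the thesis itself uses a full non-ideal equation-of-state treatment of the thermophysical properties, which is not reproduced here.

```python
# Illustrative sketch (not the thesis model): mixing-limited vaporization assumes
# the liquid length is set by the ambient-to-fuel mass ratio at which entrained
# hot ambient gas can heat the liquid fuel to saturation and supply its latent heat.
# Constant specific heats are assumed here purely for clarity.
def ambient_to_fuel_mass_ratio(T_amb, T_fuel, T_sat, cp_amb, cp_liq, h_vap):
    """Mass of ambient gas that must be entrained per unit mass of fuel
    for complete vaporization under mixing-limited assumptions."""
    q_fuel = cp_liq * (T_sat - T_fuel) + h_vap   # J required per kg of fuel
    q_amb = cp_amb * (T_amb - T_sat)             # J released per kg of ambient gas
    return q_fuel / q_amb

# Example with placeholder values: hotter ambient conditions lower the required
# ratio, consistent with the shorter liquid lengths reported in the literature.
ratio = ambient_to_fuel_mass_ratio(T_amb=900.0, T_fuel=363.0, T_sat=560.0,
                                   cp_amb=1100.0, cp_liq=2200.0, h_vap=3.0e5)
print(f"ambient/fuel mass ratio at complete vaporization: {ratio:.2f}")
```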

Relevance:

100.00%

Publisher:

Abstract:

As agricultural non-point source pollution (ANPSP) has become the most significant threat to water quality and a driver of lake eutrophication in China, more and more scientists and technologists are focusing on control countermeasures and the pollution mechanisms of ANPSP. An unreasonable rural production structure and limited scientific management measures are the main reasons for the acute ANPSP problems in China. At present, the main obstacle to pollution control is a lack of specific regulations, which reduces the government's management efficiency. In light of these characteristics and problems, this paper puts forward corresponding policies. The status of agricultural non-point source pollution in China is analyzed, and an ANPSP prevention and control model is provided based on governance policy, environmental legislation, technical systems, and subsidy policy. Finally, a case analysis of Qiandao Lake is given, and an economic policy is adopted based on its situation.

Relevance:

80.00%

Publisher:

Abstract:

Sustainable yields from water wells in hard-rock aquifers are achieved when the well bore intersects fracture networks. Fracture networks are often not readily discernible at the surface. Lineament analysis using remotely sensed satellite imagery has been employed to identify surface expressions of fracturing, and a variety of image-analysis techniques have been successfully applied in "ideal" settings. An ideal setting for lineament detection is one where the influences of human development, vegetation, and climatic conditions are minimal and the hydrogeological conditions and geologic structure are known. There is not yet a well-accepted protocol for mapping lineaments, nor have different approaches been compared in non-ideal settings. A new image-processing and synthesis approach was developed to identify satellite imagery types suited to lineament analysis in non-ideal terrain. Four satellite sensors (ASTER, Landsat7 ETM+, QuickBird, RADARSAT-1) and a digital elevation model were evaluated for lineament analysis in Boaco, Nicaragua, where the landscape is subject to varied vegetative cover, a plethora of anthropogenic features, and frequent cloud cover that limits the availability of optical satellite data. A variety of digital image processing techniques were employed and lineament interpretations were performed to obtain 12 complementary image products, which were evaluated subjectively to identify lineaments. The 12 lineament interpretations were synthesized to create a raster image of lineament zone coincidence that shows the level of agreement among the 12 interpretations. A composite lineament interpretation was made using the coincidence raster to restrict lineament observations to areas where multiple interpretations (at least 4) agree. Nine of the 11 previously mapped faults were identified from the coincidence raster. An additional 26 lineaments were identified from the coincidence raster, and the locations of 10 were confirmed by field observation. Four manual pumping tests suggest that well productivity is higher for wells proximal to lineament features. Interpretations from RADARSAT-1 products were superior to interpretations from other sensor products, suggesting that quality lineament interpretation in this region requires anthropogenic features to be minimized and topographic expressions to be maximized. The approach developed in this study has the potential to improve the siting of wells in non-ideal regions.
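The coincidence-raster synthesis described above is essentially a cell-wise vote across the 12 independent interpretations. The sketch below illustrates that idea in simplified form; the function name, the NumPy-based implementation, and the random placeholder inputs are illustrative assumptions, not the tooling used in the study.

```python
import numpy as np

# Illustrative sketch of the coincidence-raster idea: stack binary lineament
# rasters (1 = lineament zone, 0 = background) from the 12 interpretations,
# count agreement per cell, and keep only cells where at least 4 agree.
def coincidence_raster(interpretations, min_agreement=4):
    """interpretations: list of equally sized 2-D binary arrays."""
    stack = np.stack(interpretations, axis=0)   # shape: (n_interps, rows, cols)
    coincidence = stack.sum(axis=0)             # 0..n_interps agreement per cell
    composite = coincidence >= min_agreement    # composite lineament mask
    return coincidence, composite

# Example with 12 random placeholder rasters; real inputs would come from the
# processed ASTER, Landsat7 ETM+, QuickBird, RADARSAT-1, and DEM products.
rasters = [np.random.randint(0, 2, size=(100, 100)) for _ in range(12)]
coincidence, composite = coincidence_raster(rasters, min_agreement=4)
print(composite.sum(), "cells meet the agreement threshold")
```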

Relevance:

80.00%

Publisher:

Abstract:

The single-electron transistor (SET) is a charge-based device that may complement the dominant metal-oxide-semiconductor field-effect transistor (MOSFET) technology. As the cost of scaling MOSFETs to smaller dimensions rises and basic MOSFET functionality encounters numerous challenges at dimensions smaller than 10 nm, the SET has shown the potential to become the next-generation device, operating based on the tunneling of electrons. Since the electron transfer mechanism of a SET device is based on the non-dissipative electron tunneling effect, the power consumption of a SET device is extremely low, estimated to be on the order of 10^-18 J. The objectives of this research are to demonstrate technologies that would enable the mass production of SET devices that are operational at room temperature and to integrate these devices on top of an active complementary MOS (CMOS) substrate. To achieve these goals, two fabrication techniques are considered in this work. The focused ion beam (FIB) technique is used to fabricate the islands and the tunnel junctions of the SET device. An ultraviolet (UV) light-based nano-imprint lithography (NIL) process called Step-and-Flash Imprint Lithography (SFIL) is used to fabricate the interconnections of the SET devices. Combining these two techniques, a full array of SET devices is fabricated on a planar substrate. Testing and characterization of the SET devices have shown a consistent Coulomb blockade effect, an important single-electron characteristic. To realize a room-temperature SET device that functions as a logic device alongside CMOS, it is important to know the device behavior at different temperatures. Based on the theory developed for a single-island SET device, a thermal analysis is carried out on the multi-island SET device, and the observed changes in the Coulomb blockade effect are presented. The results show that multi-island SET device operation depends strongly on temperature. The important parameters that determine SET operation are the effective capacitance Ceff and the tunneling resistance Rt. These two parameters determine the tunneling rate of an electron in the SET device, Γ. To obtain an accurate model of SET operation, the effects of deviations in dimensions, trap states in the insulation, and background charge have to be taken into consideration. Theoretical and experimental evidence for these non-ideal effects is presented in this work.
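As a rough illustration of why Ceff and Rt govern device behavior, the sketch below evaluates the standard single-island "orthodox theory" expressions for the charging energy and the tunneling rate Γ. This is the textbook single-island approximation, not the multi-island thermal model developed in the work, and the numerical values are placeholders.

```python
import math

E_CHARGE = 1.602176634e-19   # elementary charge, C
K_B = 1.380649e-23           # Boltzmann constant, J/K

def charging_energy(c_eff):
    """Charging energy of an island with effective capacitance C_eff."""
    return E_CHARGE**2 / (2.0 * c_eff)

def tunneling_rate(dE, r_t, temperature):
    """Orthodox-theory tunneling rate through a junction with resistance R_t,
    given the free-energy decrease dE (J) associated with the tunneling event."""
    if dE == 0.0:
        return K_B * temperature / (E_CHARGE**2 * r_t)  # limiting value
    return dE / (E_CHARGE**2 * r_t * (1.0 - math.exp(-dE / (K_B * temperature))))

# Placeholder numbers: room-temperature operation requires the charging energy
# to well exceed k_B*T (~26 meV at 300 K), i.e. C_eff on the order of 1 aF or less.
c_eff = 0.5e-18   # F (0.5 aF)
print("E_C =", charging_energy(c_eff) / E_CHARGE, "eV")
print("k_B*T at 300 K =", K_B * 300 / E_CHARGE, "eV")
```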

Relevance:

40.00%

Publisher:

Abstract:

An electrospray source has been developed using a novel fluid that is both magnetic and conductive. Unlike conventional electrospray sources, which require microfabricated structures to support the fluid being electrosprayed, this new electrospray fluid utilizes the Rosensweig instability to create structures in the magnetic fluid when an external magnetic field is applied. Application of an external electric field caused these magnetic fluid structures to spray. These fluid-based structures were found to spray at a lower onset voltage than was predicted for electrospray sources with solid structures of similar geometry. They were also found to be resilient to damage, unlike the solid structures found in traditional electrospray sources. Further, experimental studies of magnetic fluids in non-uniform magnetic fields were conducted. The modes of Rosensweig instabilities created by uniform magnetic fields have been studied in depth, but few studies have been performed on Rosensweig instabilities formed by non-uniform magnetic fields. The measured spacing of the cone-like structures of ferrofluid in a non-uniform magnetic field was found to agree with a proposed theoretical model.
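For context, in the classical uniform-field case the Rosensweig (normal-field) instability produces peaks whose characteristic spacing is set by the capillary length of the fluid. The sketch below evaluates that textbook relation under placeholder fluid properties; the non-uniform-field model proposed in the work itself is not reproduced here.

```python
import math

# Textbook uniform-field estimate: the critical wavelength of the Rosensweig
# normal-field instability scales with the capillary length,
#   lambda_c = 2*pi*sqrt(sigma / (delta_rho * g)).
def rosensweig_critical_wavelength(surface_tension, density_difference, g=9.81):
    return 2.0 * math.pi * math.sqrt(surface_tension / (density_difference * g))

# Placeholder properties loosely representative of a hydrocarbon-based ferrofluid in air.
sigma = 0.025        # N/m
delta_rho = 1200.0   # kg/m^3
spacing = rosensweig_critical_wavelength(sigma, delta_rho)
print(f"critical peak spacing ~ {spacing * 1e3:.1f} mm")
```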

Relevance:

30.00%

Publisher:

Abstract:

Mower is a micro-architecture technique which targets branch misprediction penalties in superscalar processors. It speeds up the misprediction recovery process by dynamically evicting stale instructions and fixing the RAT (Register Alias Table) using explicit branch dependency tracking. Tracking branch dependencies is accomplished by using simple bit matrices. This low-overhead technique allows overlapping of the recovery process with instruction fetching, renaming, and scheduling from the correct path. Our evaluation of the mechanism indicates that it yields performance very close to ideal recovery and provides up to 5% speed-up and 2% reduction in power consumption compared to a traditional recovery mechanism using a reorder buffer and a walker. The simplicity of the mechanism should permit easy implementation of Mower in an actual processor.
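The bit-matrix bookkeeping behind this idea can be sketched compactly: each in-flight instruction carries a bit vector recording which unresolved branches it depends on, so that on a misprediction the stale instructions can be identified and evicted directly, without walking a reorder buffer. The sketch below is a simplified software model for understanding the concept; the class and method names are illustrative and are not taken from the Mower design, which is implemented as hardware bit matrices.

```python
# Illustrative software model of explicit branch-dependency tracking with
# per-instruction bit vectors (one bit per outstanding branch).
class BranchDependencyTracker:
    def __init__(self, max_branches=16):
        self.max_branches = max_branches
        self.current_mask = 0    # bits of all currently unresolved branches
        self.inst_masks = {}     # instruction id -> branch-dependency bit vector

    def rename_instruction(self, inst_id):
        # A newly renamed instruction depends on all unresolved older branches.
        self.inst_masks[inst_id] = self.current_mask

    def dispatch_branch(self, branch_bit):
        # Allocate a bit for a new, not-yet-resolved branch.
        self.current_mask |= (1 << branch_bit)

    def resolve_branch(self, branch_bit, mispredicted):
        bit = 1 << branch_bit
        self.current_mask &= ~bit
        if mispredicted:
            # Evict every instruction whose mask contains this branch's bit;
            # independent instructions survive, so recovery can overlap with
            # fetching, renaming, and scheduling from the correct path.
            stale = [i for i, m in self.inst_masks.items() if m & bit]
            for i in stale:
                del self.inst_masks[i]
            return stale
        # Correct prediction: clear the bit from surviving dependency masks.
        for i in self.inst_masks:
            self.inst_masks[i] &= ~bit
        return []
```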

Relevance:

30.00%

Publisher:

Abstract:

Synthetic oligonucleotides and peptides have found wide applications in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development. Global annual sales of peptide drugs in 2010 were estimated to be $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated to be $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions cannot be complete, which generates undesired truncated failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It requires large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated to be more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences. In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods do not need chromatography, so the drawbacks of HPLC no longer apply. Using them, purification is achieved by simple manipulations such as shaking and extraction. Therefore, they are suitable for large-scale purification of oligonucleotide and peptide drugs and are also ideal for high-throughput purification, which is currently in high demand for research projects involving total gene synthesis. This dissertation presents the details of the development of these techniques. Chapter 1 introduces oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 describes detailed studies of using the catching-failure-sequences-by-polymerization method to purify ODNs. Chapter 3 describes further optimization of this ODN purification technology to the level of practical use. Chapter 4 presents the catching-full-length-sequences-by-polymerization method for ODN purification using an acid-cleavable linker. Chapter 5 introduces peptides and their synthesis and purification. Chapter 6 describes studies using the catching-full-length-sequences-by-polymerization method for peptide purification.

Relevance:

30.00%

Publisher:

Abstract:

Computational models for the investigation of flows in deformable tubes are developed and implemented in the open-source computing environment OpenFOAM. Simulations for Newtonian and non-Newtonian fluids under various flow conditions are carried out and analyzed. First, simulations are performed to investigate the flow of a shear-thinning, non-Newtonian fluid in a collapsed elastic tube, and comparisons are made with experimental data. The fluid is modeled by means of the Bird-Carreau viscosity law. The computational domain of the deformed tube is constructed from data obtained via computed tomography imaging. Comparison of the computed velocity fields with ultrasound Doppler velocity profile measurements shows good agreement, as does the adjusted pressure drop along the tube's axis. Analysis of the shear rates shows that the shear-thinning effect of the fluid becomes relevant in the cross-sections with the largest deformation. Peristaltic motion is simulated by means of upper and lower rollers squeezing the fluid along a tube. Two frames of reference are considered: in the moving frame the computational domain is fixed and the coordinate system moves with the roller, and in the fixed frame the roller is represented by a deforming mesh. Several two-dimensional simulations are carried out for Newtonian and non-Newtonian fluids. The effect of the shear-thinning behavior of the fluid on the transport efficiency is examined. In addition, the influence of the roller speed and the gap width between the rollers on the transport efficiency is discussed. Comparison with experimental data is also presented, and different types of moving waves are implemented.
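The Bird-Carreau law mentioned above gives an apparent viscosity that plateaus at low and high shear rates and thins in between. A minimal sketch of that constitutive relation is shown below; the parameter values are placeholders, not the fluid properties fitted in the study.

```python
# Illustrative Bird-Carreau viscosity law:
#   mu(gamma_dot) = mu_inf + (mu_0 - mu_inf) * (1 + (lambda*gamma_dot)^2)^((n-1)/2)
def bird_carreau_viscosity(gamma_dot, mu_0, mu_inf, lam, n):
    """Apparent viscosity (Pa*s) at shear rate gamma_dot (1/s)."""
    return mu_inf + (mu_0 - mu_inf) * (1.0 + (lam * gamma_dot) ** 2) ** ((n - 1.0) / 2.0)

# Placeholder parameters for a shear-thinning fluid (n < 1).
for gamma_dot in (0.1, 1.0, 10.0, 100.0, 1000.0):
    mu = bird_carreau_viscosity(gamma_dot, mu_0=0.05, mu_inf=0.003, lam=1.0, n=0.5)
    print(f"shear rate {gamma_dot:7.1f} 1/s -> apparent viscosity {mu:.5f} Pa*s")
```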

Relevance:

30.00%

Publisher:

Abstract:

By employing interpretive policy analysis, this thesis aims to assess, measure, and explain policy capacity for government and non-government organizations involved in reclaiming Alberta's oil sands. Using this type of analysis to assess policy capacity is a novel approach for understanding reclamation policy; therefore, this research provides a unique contribution to the literature on reclamation policy. The oil sands region in northeast Alberta, Canada, is an area of interest for several reasons, primarily the vast reserves of bitumen and the environmental cost associated with developing this resource. An increase in global oil demand has established incentive for industry to seek out and develop new reserves. Alberta's oil sands are one of the largest remaining reserves in the world, and there is significant interest in increasing production in this region. Furthermore, tensions in several oil-exporting nations in the Middle East remain unresolved, and this has garnered additional support for a supply-side solution to North American oil demands. This solution relies upon the development of reserves in both the United States and Canada. These compounding factors have contributed to increased development in the oil sands of northeastern Alberta. Essentially, a rapid expansion of oil sands operations is ongoing and is the source of significant disturbance across the region. This disturbance, and the promise of reclamation, is a source of contentious debate among stakeholders and continues to be highly visible in the media. If oil sands operations are to retain their social license to operate, it is critical that reclamation efforts be effective. One concern expressed by non-governmental organizations (NGOs) is the current monitoring and enforcement of regulatory programs in the oil sands. Alberta's NGOs have suggested that the data made available to them originate from industrial sources and are generally unchecked by government. In an effort to discern the overall status of reclamation in the oil sands, this study explores several factors essential to policy capacity: work environment, training, employee attitudes, perceived capacity, policy tools, evidence-based work, and networking. Data were collected through key informant interviews with senior policy professionals in government and non-government agencies in Alberta. The following are agencies of interest in this research: Canadian Association of Petroleum Producers (CAPP); Alberta Environment and Sustainable Resource Development (AESRD); Alberta Energy Regulator (AER); Cumulative Environmental Management Association (CEMA); Alberta Environmental Monitoring, Evaluation, and Reporting Agency (AEMERA); and Wood Buffalo Environmental Association (WBEA). The aim of this research is to explain how and why reclamation policy is conducted in Alberta's oil sands. This will illuminate government capacity, NGO capacity, and the interaction of these two agency typologies. In addition to answering the research questions, another goal of this project is to show that interpretive analysis of policy capacity can be used to measure and predict policy effectiveness. The oil sands of Alberta are the focus of this project; however, future projects could focus on any government policy scenario utilizing evidence-based approaches.