7 results for 650200 Mining and Extraction

in Digital Commons - Michigan Tech


Relevance:

100.00%

Publisher:

Abstract:

Bioplastics are polymers (such as polyesters) produced by bacterial fermentation that are biodegradable and nonhazardous. They are made by a wide variety of bacteria, but only under stress conditions, specifically when nutrient levels of nitrogen and oxygen are low. These conditions cause certain bacteria to accumulate excess carbon as energy reserves in the form of polyhydroxyalkanoates (PHAs). PHAs can be extracted and formed into plastic with the same strength as conventional, petroleum-based plastics, without the need to rely on foreign petroleum. The overall goal of this project was to select a bacterium that could grow on sugars found in lignocellulosic biomass and induce it to produce PHAs and peptidoglycan. Once this was accomplished, the goal was to extract the PHAs and peptidoglycan and combine them into a co-polymer to make a stronger, more rigid plastic. The individual goals of this project were to: (1) select and screen bacteria capable of producing PHAs by utilizing the carbon/energy sources found in lignocellulosic biomass; (2) maximize the utilization of the sugars present in woody biomass in order to produce optimal levels of PHAs; and (3) use room-temperature ionic liquids (RTILs) to separate the cell membrane and peptidoglycan, allowing for better extraction of PHAs and more intact peptidoglycan. B. megaterium, a Gram-positive, PHA-producing bacterium, was selected for study in this project. It was grown on a variety of substrates in order to maximize both its growth and its production of PHAs. The optimal conditions were found to be 30°C, pH 6.0, and a sugar concentration of 30 g/L of either glucose or xylose. After optimal growth was obtained, both RTILs and enzymatic treatments were used to break the cell wall and extract the PHAs and peptidoglycan.
PHAs and peptidoglycan were successfully extracted from the cells and will be used in the future to create a new, stronger co-polymer. The peptidoglycan recovery yield was 16% of the cells’ dry weight.

Relevance:

100.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that developers discover in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. Therefore, the main goal of this research work is to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part consists of mining the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks an assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and hence enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms.
This work improves the computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
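The invariant-checking idea in the first part can be sketched as follows. This is a minimal illustration assuming a simple list-of-events trace format and one hand-written temporal invariant; FOLCSL itself synthesizes such checkers automatically from first-order specifications, and the event names below are hypothetical:

```python
# Minimal sketch of checking a model invariant against a simulator's
# event trace. The trace format and event names ("fetch", "decode")
# are illustrative assumptions, not the actual FOLCSL machinery.

def check_eventually_follows(trace, cause, effect):
    """Invariant: every `cause` event is eventually followed by an `effect`."""
    pending = 0
    for event in trace:
        if event == cause:
            pending += 1          # an obligation is opened
        elif event == effect and pending > 0:
            pending -= 1          # an obligation is discharged
    return pending == 0           # True iff every cause was matched

# A toy pipeline trace: every fetch must eventually be decoded.
trace = ["fetch", "decode", "fetch", "decode", "execute"]
print(check_eventually_follows(trace, "fetch", "decode"))  # True
```

A synthesized verification program would run many such checks, one per specified invariant, over the full trace and report any violation.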

Relevance:

100.00%

Publisher:

Abstract:

Today sustainable development is a very pertinent issue. Communities do not want companies, specifically mining companies, to deplete a natural resource and leave. The goal is to minimize the negative impacts of mining and of the boom/bust cycles of natural resource extraction. In this study, a three-part framework was developed to analyze the sustainability of the Flambeau Mine in Ladysmith, Wisconsin. The first and second parts comprised an in-depth local and regional analysis and asked whether the community was developing within its own vision. The third part used nine sustainability measures: (1) needs of the present generation; (2) future needs; (3) acceptable legacy; (4) full cost; (5) contribution to economic development; (6) equity; (7) consent; (8) respect for ecological limits, maintenance of ecological integrity, and landscape requirements; and (9) offsetting restoration. This study concluded that the Flambeau Mine was sustainable relative to the first two criteria and can be considered mostly sustainable relative to the nine measures. Overall, the Flambeau Mine was a beneficial project for the Ladysmith, Wisconsin area. Additionally, it appeared to decrease the public’s negative perception of mining. Recommendations for future analytical work are made, and suggestions are offered as to how mining companies could increase the potential for attaining sustainability in their projects. It is recommended that this framework be used by other industries.

Relevance:

100.00%

Publisher:

Abstract:

This dissertation examines the global technological and environmental history of copper smelting and the conflict that developed between historic preservation and environmental remediation at major copper smelting sites in the United States after their productive periods ended. Part I of the dissertation is a synthetic overview of the history of copper smelting and its environmental impact. After reviewing the basic metallurgy of copper ores, the dissertation contains successive chapters on the history of copper smelting to 1640, culminating in the so-called German, or Continental, processing system; on the emergence of the rival Welsh system during the British industrial revolution; and on the growth of American dominance in copper production in the late 19th and early 20th centuries. The latter chapter focuses, in particular, on three of the most important early American copper districts: Michigan’s Keweenaw Peninsula, Tennessee’s Copper Basin, and Butte-Anaconda, Montana. As these three districts went into decline and ultimately out of production, they left a rich industrial heritage along with significant waste and pollution problems, generated by increasingly sophisticated technologies capable of commercially processing steadily growing volumes of ores of decreasing richness. Part II of the dissertation looks at the conflict between historic preservation and environmental remediation that emerged locally and nationally in copper districts as they went into decline and eventually ceased production. Locally, former copper mining communities often split between those who wished to commemorate a region’s past importance and develop heritage tourism, and local developers who wished to clear up and clean out old industrial sites for other purposes.
Nationally, Congress passed laws in the 1960s and 1970s mandating the preservation of historical resources (National Historic Preservation Act) and laws mandating the cleanup of contaminated landscapes (CERCLA, or Superfund), objectives that sometimes conflicted, especially in the case of copper smelting sites. The dissertation devotes individual chapters to the conflicts that developed between environmental remediation, particularly involving the Environmental Protection Agency, and the heritage movement in the Tennessee, Montana, and Michigan copper districts. A concluding chapter provides a broad model to illustrate the relationship between industrial decline, federal environmental remediation activities, and the growth of heritage consciousness in former copper mining and smelting areas; analyzes why the outcome varied in the three areas; and suggests methods for dealing with heritage-remediation issues to minimize conflict and maximize heritage preservation.

Relevance:

100.00%

Publisher:

Abstract:

Highway infrastructure plays a significant role in society. The building and upkeep of America’s highways provide society the necessary means of transportation for the goods and services needed to develop as a nation. However, as a result of economic and social development, vast amounts of greenhouse gas (GHG) emissions are released into the atmosphere, contributing to global climate change. Recognizing this, future policies may mandate the monitoring of GHG emissions from public agencies and private industries in order to reduce the effects of global climate change. To effectively reduce these emissions, there must be methods that agencies can use to quantify the GHG emissions associated with constructing and maintaining the nation’s highway infrastructure. Current methods for assessing the impacts of highway infrastructure include methodologies that look at the economic impacts (costs) of constructing and maintaining highway infrastructure over its life cycle. This is known as Life Cycle Cost Analysis (LCCA). With the recognition of global climate change, transportation agencies and contractors are also investigating the environmental impacts associated with highway infrastructure construction and rehabilitation. A common tool for doing so is Life Cycle Assessment (LCA). Traditionally, LCA is used to assess the environmental impacts of products or processes; it is an emerging concept in highway infrastructure assessment and is now being implemented and applied to transportation systems. This research focuses on life-cycle GHG emissions associated with the construction and rehabilitation of highway infrastructure using an LCA approach. Life-cycle phases of the highway section include material acquisition and extraction, construction and rehabilitation, and service.
Departing from traditional approaches that tend to use LCA as a way to compare alternative pavement materials or designs based on estimated inventories, this research proposes a shift to a context-sensitive, process-based approach that uses actual observed construction and performance data to calculate greenhouse gas emissions associated with highway construction and rehabilitation. The goal is to support strategies that reduce long-term environmental impacts. Ultimately, this thesis outlines techniques that can be used to assess GHG emissions associated with construction and rehabilitation operations to support the overall pavement LCA.
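The process-based tally described above reduces to observed activity quantities multiplied by emission factors, summed per life-cycle phase. The following is a minimal sketch of that arithmetic; the phase names, activities, and factor values are hypothetical placeholders, not figures from this research:

```python
# Hedged sketch of a process-based GHG tally across life-cycle phases.
# All emission factors (kg CO2e per unit) and activity quantities below
# are illustrative placeholders, not values from the study.

EMISSION_FACTORS = {
    "asphalt_tonne": 60.0,   # per tonne of asphalt placed
    "diesel_litre": 2.7,     # per litre of equipment fuel burned
    "haul_km": 1.1,          # per truck-kilometre of material hauling
}

def phase_emissions(activities):
    """Sum kg CO2e for one phase given observed activity quantities."""
    return sum(qty * EMISSION_FACTORS[name] for name, qty in activities.items())

# Observed quantities recorded per phase (hypothetical).
phases = {
    "material acquisition": {"asphalt_tonne": 500},
    "construction":         {"diesel_litre": 2000, "haul_km": 300},
}

total = sum(phase_emissions(acts) for acts in phases.values())
print(round(total, 1))  # 35730.0
```

Using observed quantities rather than estimated inventories is what makes the approach context-sensitive: the same formula is re-evaluated with each project's actual construction and performance data.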

Relevance:

100.00%

Publisher:

Abstract:

High horizontal stresses can cause numerous ground control problems in mines and other underground structures, ultimately impacting worker safety, productivity, and the economics of an underground operation. Mine layout and design can be optimized when the presence and orientation of these stresses are recognized and their impact minimized. A simple technique for correlating the principal horizontal stress direction in a sedimentary rock mass with the preferential orientation of moisture-induced expansion in a sample of the same rock was introduced in the 1970s but has since gone unreported and unused. This procedure was reexamined at a locality near the original test site at White Pine, Michigan, in order to validate the original research and to consider its usefulness in mining and civil engineering applications under high horizontal stress conditions. The procedure may also be useful as an economical means of characterizing regional stress fields.

Relevance:

100.00%

Publisher:

Abstract:

Synthetic oligonucleotides and peptides have found wide application in industry and in academic research labs. There are ~60 peptide drugs on the market and over 500 under development. Global annual sales of peptide drugs in 2010 were estimated at $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated at $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions cannot be complete, which generates truncated, undesired failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It requires large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated to be more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences.
In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods do not require chromatography, so the drawbacks of HPLC no longer apply. Using them, purification is achieved by simple manipulations such as shaking and extraction. They are therefore suitable for large-scale purification of oligonucleotide and peptide drugs, and also ideal for high-throughput purification, for which there is currently high demand in research projects involving total gene synthesis. The dissertation will present the details of the development of these techniques. Chapter 1 will introduce oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 will describe detailed studies of using the catching-failure-sequences-by-polymerization method to purify ODNs. Chapter 3 will describe further optimization of that ODN purification technology to the level of practical use. Chapter 4 will present use of the catching-full-length-sequences-by-polymerization method for ODN purification with an acid-cleavable linker. Chapter 5 will introduce peptides and their synthesis and purification. Chapter 6 will describe studies using the catching-full-length-sequences-by-polymerization method for peptide purification.