820 results for Engineering teaching process
Abstract:
As the demand for miniature products and components continues to increase, the need for manufacturing processes to provide these products and components has also increased. To meet this need, successful macroscale processes are being scaled down and applied at the microscale. Unfortunately, many challenges arise when directly scaling down macro processes. Initially, frictional effects were believed to be the largest challenge; however, recent studies have found that the greatest challenge is size effects. Size effect is a broad term that largely refers to the thickness of the material being formed and how this thickness directly affects the product dimensions and manufacturability. At the microscale, the thickness becomes critical due to the reduced number of grains. When surface contact between the forming tools and the material blank occurs at the macroscale, there is enough material (hundreds of layers of material grains) across the blank thickness to compensate for material flow and the effect of grain orientation. At the microscale, there may be fewer than 10 grains across the blank thickness. With so few grains across the thickness, the influence of grain size, shape, and orientation is significant, and any material defects (either naturally occurring or introduced during material preparation) play a significant role in altering the forming potential. To date, various micro metal forming and micro materials testing equipment setups have been constructed at the Michigan Tech lab. Initially, the research focus was to create a micro deep drawing setup to potentially build micro sensor encapsulation housings. The research focus then shifted to micro metal materials testing equipment setups.
These include the construction and testing of the following setups: a micro mechanical bulge test, a micro sheet tension test (testing micro tensile bars), a micro strain analysis (using optical lithography and chemical etching), and a micro sheet hydroforming bulge test. Recently, the focus has shifted to the study of a micro tube hydroforming process, targeting fuel cell, medical, and sensor encapsulation applications. While the tube hydroforming process is widely understood at the macroscale, the microscale process presents significant challenges in terms of size effects. Current work applies direct current to enhance micro tube hydroforming formability. Adding direct current to various metal forming operations has shown remarkable results in initial trials, and the focus of current research is to determine the validity of this process.
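The size-effect argument above comes down to simple counting: how many grains span the blank thickness. A minimal sketch, using hypothetical thickness and grain-size values (the abstract gives no specific numbers):

```python
# Sketch: why blank thickness matters at the microscale.
# The 25-micron average grain size and blank thicknesses are illustrative
# assumptions, not values from the study.
def grains_across_thickness(blank_thickness_um, avg_grain_size_um):
    """Estimate the number of grains spanning the blank thickness."""
    return blank_thickness_um / avg_grain_size_um

# Macroscale blank: 2 mm sheet, 25 um grains -> ~80 grains across.
macro = grains_across_thickness(2000, 25)
# Microscale blank: 100 um foil, same grain size -> only 4 grains across.
micro = grains_across_thickness(100, 25)
print(macro, micro)  # 80.0 4.0
```

With only a handful of grains across the thickness, each grain's individual size, shape, and orientation dominates the deformation, which is the core of the size effect described above.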
Abstract:
Waste effluents from the forest products industry are sources of lignocellulosic biomass that can be converted to ethanol by yeast after pretreatment. However, the challenge of improving ethanol yields from a mixed pentose and hexose fermentation of a potentially inhibitory hydrolysate still remains. Hardboard manufacturing process wastewater (HPW) was evaluated as a potential feedstream for lignocellulosic ethanol production by native xylose-fermenting yeast. After screening of xylose-fermenting yeasts, Scheffersomyces stipitis CBS 6054 was selected as the ideal organism for conversion of the HPW hydrolysate material. The individual and synergistic effects of inhibitory compounds present in the hydrolysate were evaluated using response surface methodology. It was concluded that organic acids have an additive negative effect on fermentations. Fermentation conditions were also optimized in terms of aeration and pH, and methods for improving productivity and achieving higher ethanol yields were investigated. Adaptation to the conditions present in the hydrolysate was achieved through repeated cell sub-culturing. The objectives of this study were to adapt S. stipitis CBS 6054 to a dilute-acid-pretreated, lignocellulose-containing waste stream; compare the physiological, metabolic, and proteomic profiles of the adapted strain to those of its parent; quantify changes in protein expression/regulation, metabolite abundance, and enzyme activity; and determine the biochemical and molecular mechanisms of adaptation. The adapted culture showed improvement in both substrate utilization and ethanol yield compared to the unadapted parent strain, and exhibited a distinct growth phenotype based on its physiological and proteomic profiles. Several potential targets that could be responsible for the strain improvement were identified.
These targets could have implications for metabolic engineering of strains for improved ethanol production from lignocellulosic feedstocks. Although this work focuses specifically on the conversion of HPW to ethanol, the methods developed can be applied to any feedstock/product system that employs a microbial conversion step. The benefit of this research is that the organisms can be optimized for a company's specific system.
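The "additive negative effect" finding above corresponds to a main-effects-only response model, the kind response surface methodology distinguishes from models with interaction terms. A minimal sketch with entirely hypothetical coefficients (the abstract reports no fitted values):

```python
# Sketch: an additive (no-interaction) inhibition model of the kind RSM can
# test for. Base yield and penalty coefficients are hypothetical, not data
# from the study.
def predicted_yield(acetic_g_l, formic_g_l, base_yield=0.42,
                    k_acetic=0.010, k_formic=0.025):
    """Ethanol yield (g/g) reduced linearly by each organic acid, no interaction term."""
    return max(0.0, base_yield - k_acetic * acetic_g_l - k_formic * formic_g_l)

# Additivity: the combined yield loss equals the sum of the individual losses.
loss_both = 0.42 - predicted_yield(5, 2)
loss_sum = (0.42 - predicted_yield(5, 0)) + (0.42 - predicted_yield(0, 2))
print(round(loss_both, 3), round(loss_sum, 3))  # equal, by construction
```

A synergistic effect would instead require a cross term (e.g. k12 * acetic * formic), which is exactly what a response-surface design can detect or rule out.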
Abstract:
This report shares my efforts in developing a solid unit of instruction that has a clear focus on student outcomes. I have been a teacher for 20 years and have been writing and revising curricula for much of that time. However, most of that work was done without the benefit of current research on how students learn and did not focus on what and how students are learning. My journey as a teacher has involved a lot of trial and error. My traditional method of teaching is to look at the benchmarks (now content expectations) to see what needs to be covered. My unit consists of having students read the appropriate sections in the textbook, complete worksheets, watch a video, and take some notes. I try to include at least one hands-on activity, one or more quizzes, and the traditional end-of-unit test consisting mostly of multiple-choice questions I find in the textbook. I try to be engaging, make the lessons fun, and hope that at the end of the unit my students grasp whatever concepts I've presented so that we can move on to the next topic. I want to increase students' understanding of science concepts and their ability to connect that understanding to the real world. However, sometimes I feel that my lessons are missing something. For a long time I have wanted to develop a unit of instruction that I know is an effective tool for the teaching and learning of science. In this report, I describe my efforts to reform my curricula using the "Understanding by Design" process. I want to see if this style of curriculum design will help me be a more effective teacher and if it will lead to an increase in student learning. My hypothesis is that this new (for me) approach to teaching will lead to increased understanding of science concepts among students because it is based on purposefully thinking about learning targets based on "big ideas" in science.
For my reformed curricula I incorporate lessons from several outstanding programs I've been involved with, including EpiCenter (Purdue University), Incorporated Research Institutions for Seismology (IRIS), the Master of Science Program in Applied Science Education at Michigan Technological University, and the Michigan Association for Computer Users in Learning (MACUL). In this report, I present the methodology I used to develop a new unit of instruction based on the Understanding by Design process. I present several lessons and learning plans I've developed for the unit that follow the 5E Learning Cycle as appendices at the end of this report. I also include the results of pilot testing of one of the lessons. Although the lesson I pilot-tested was not as successful in increasing student learning outcomes as I had anticipated, the development process I followed was helpful in that it required me to focus on important concepts. Conducting the pilot test was also helpful because it led me to identify ways in which I could improve the lesson in the future.
Abstract:
Polycarbonate (PC) is an important engineering thermoplastic currently produced on a large industrial scale from bisphenol A and phosgene. Since phosgene is highly toxic, a non-phosgene approach using diphenyl carbonate (DPC) as an alternative monomer, developed by Asahi Corporation of Japan, is a significantly more environmentally friendly alternative. Other advantages include the use of CO2 instead of CO as a raw material and the elimination of major wastewater production. However, for the production of DPC to be economically viable, reactive distillation units are needed to obtain the necessary yields by shifting the reaction equilibrium toward the desired products and separating the products at the point where the equilibrium reaction occurs. In the field of chemical reaction engineering, many reactions suffer from low equilibrium constants. The main goal of this research is to determine the optimal process needed to shift the reactions by using appropriate control strategies for the reactive distillation system. An extensive dynamic mathematical model has been developed to investigate different control and processing strategies of the reactive distillation units to increase the production of DPC. The high-fidelity dynamic models include extensive thermodynamic and reaction-kinetics submodels and incorporate the necessary mass and energy balances for the various stages of the reactive distillation units. The study presented in this document shows the possibility of producing DPC via one reactive distillation column instead of the conventional two-column configuration, with a production rate of 16.75 tons/h from starting materials of 74.69 tons/h of phenol and 35.75 tons/h of dimethyl carbonate. This represents a threefold increase over the projected production rate given in the literature for a two-column configuration.
In addition, the purity of the DPC produced could reach levels as high as 99.5% with the effective use of controls. These studies are based on simulations performed using high-fidelity dynamic models.
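The equilibrium-shifting idea behind reactive distillation can be illustrated with a toy calculation. A minimal sketch for a generic equimolar reaction A + B <=> C + D; the equilibrium constant and stage count are hypothetical, not the DPC system's actual values:

```python
import math

# Sketch: conversion for A + B <=> C + D with equimolar feed, and how repeated
# product removal (the essence of reactive distillation) raises conversion.
# K = 0.003 and 20 stages are illustrative assumptions only.
def equilibrium_conversion(K):
    """Solve K = x^2 / (1-x)^2 for the closed-reactor conversion x."""
    s = math.sqrt(K)
    return s / (1 + s)

def staged_conversion(K, stages):
    """Re-equilibrate after removing products at each stage; track unreacted A."""
    remaining = 1.0
    for _ in range(stages):
        remaining *= 1 - equilibrium_conversion(K)
    return 1 - remaining

x_closed = equilibrium_conversion(0.003)      # ~5% in a single closed reactor
x_staged = staged_conversion(0.003, 20)       # ~66% after 20 removal stages
print(round(x_closed, 3), round(x_staged, 3))
```

This is the qualitative mechanism only; the actual column model in the work couples stage-wise mass and energy balances with full thermodynamics and reaction kinetics.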
Abstract:
Attempts to strengthen a chromium-modified titanium trialuminide by a combination of grain size refinement and dispersoid strengthening led to a new means of synthesizing such materials. This Reactive Mechanical Alloying/Milling process uses in situ reactions between the metallic powders and elements from a process control agent and/or a gaseous environment to assemble a dispersed phase of small hard particles within the matrix by a bottom-up approach. In the current research, milled powders of the trialuminide alloy along with titanium carbide were produced. The amount of carbide can be varied widely with simple processing changes, and in this case the milling process created trialuminide grain sizes and carbide particles that are the smallest known from such a process. Characterization of these materials required the development of x-ray diffraction methods to determine particle sizes, by deconvoluting and synthesizing components of the complex multiphase diffraction patterns, and to carry out whole-pattern analysis of the diffuse scattering that developed from larger-than-usual, highly defective grain boundary regions. These regions provide an important mass transport capability during processing; they not only facilitate the alloy development but also add to the understanding of the mechanical alloying process. Consolidation of the milled powder, which consisted of small crystallites of the alloy and dispersed carbide particles two nanometers in size, formed a unique, somewhat coarsened microstructure, producing an ultra-high-strength solid material composed of the chromium-modified titanium trialuminide alloy matrix with small platelets of the complex carbides Ti2AlC and Ti3AlC2. This synthesis process provides the unique ability to nano-engineer a wide variety of composite materials or special alloys, and has shown the ability to be extended to a wide variety of metallic materials.
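Determining particle sizes from diffraction-peak broadening is classically done with the Scherrer equation; the work above goes further with full deconvolution and whole-pattern analysis, but the Scherrer estimate conveys the principle. A minimal sketch with illustrative inputs (Cu K-alpha wavelength is standard; the peak width and position are assumed, not measured data from the study):

```python
import math

# Sketch: classical Scherrer crystallite-size estimate from XRD peak broadening,
# a simplified stand-in for the multiphase deconvolution the work develops.
def scherrer_size_nm(wavelength_nm, fwhm_deg, two_theta_deg, shape_factor=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)), beta in radians."""
    beta = math.radians(fwhm_deg)          # peak full width at half maximum
    theta = math.radians(two_theta_deg / 2)
    return shape_factor * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha radiation (0.15406 nm); a very broad 4-degree peak at 2theta = 42 deg
# corresponds to a crystallite only a couple of nanometers across.
size = scherrer_size_nm(0.15406, 4.0, 42.0)
print(round(size, 2))  # ~2.13 nm
```

Note the inverse relationship: the broader the peak, the smaller the crystallite, which is why two-nanometer carbide particles produce such diffuse patterns that whole-pattern methods become necessary.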
Abstract:
In many deposits of silver ores the grade of the ore decreases considerably a few hundred feet below the surface. It is believed that in many cases the better ores owe their richness in part to the process of sulphide enrichment. It is recognized, however, that many rich silver ores are hypogene deposits that have been affected very little, if any, by processes of enrichment.
Abstract:
This work presents a 1-D process-scale model used to investigate the chemical dynamics and temporal variability of nitrogen oxides (NOx) and ozone (O3) within and above the snowpack at Summit, Greenland for March-May 2009, and estimates the surface exchange of NOx between the snowpack and the surface layer in April-May 2009. The model assumes the surfaces of snowflakes have a liquid-like layer (LLL) where aqueous chemistry occurs and interacts with the interstitial air of the snowpack. Model parameters and initialization are physically and chemically representative of the snowpack at Summit, Greenland, and model results are compared to measurements of NOx and O3 collected by our group at Summit from 2008-2010. The model, paired with measurements, confirmed the main hypothesis in the literature that photolysis of nitrate on the surfaces of snowflakes is responsible for nitrogen dioxide (NO2) production in the top ~50 cm of the snowpack at solar noon for the March-May periods in 2009. Nighttime peaks of NO2 in the snowpack for April and May were reproduced by aqueous formation of peroxynitric acid (HNO4) in the top ~50 cm of the snowpack, with subsequent mass transfer to the gas phase, decomposition to form NO2 at nighttime, and transport of the NO2 to depths of 2 meters. Modeled production of HNO4 was hindered in March 2009 due to the low production of its precursor, the hydroperoxy radical, resulting in underestimation of nighttime NO2 in the snowpack for March 2009. The aqueous reaction of O3 with formic acid was the major sink of O3 in the snowpack for March-May 2009. Nitrogen monoxide (NO) production in the top ~50 cm of the snowpack is related to the photolysis of NO2; the model underestimates NO in May 2009. Modeled surface exchange of NOx in April and May is on the order of 10^11 molecules m^-2 s^-1.
Removing the measured downward fluxes of NO and NO2 from the measured fluxes resulted in agreement between measured NOx fluxes and modeled surface exchange in April, and an order-of-magnitude deviation in May. Modeled transport of NOx above the snowpack in May shows an order-of-magnitude increase in NOx fluxes in the first 50 cm of the snowpack, attributed to the production of NO2 during the day from the thermal decomposition and photolysis of peroxynitric acid, with minor contributions of NO from HONO photolysis in the early morning.
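To put the modeled surface exchange of ~10^11 molecules m^-2 s^-1 in more familiar chemical units, a quick unit conversion using only Avogadro's number (no model data assumed):

```python
# Sketch: converting a molecular flux to nmol m^-2 h^-1.
# Uses only Avogadro's constant; the 1e11 value is the order of magnitude
# quoted in the abstract.
N_A = 6.022e23  # molecules per mole

def flux_nmol_per_m2_per_hour(molecules_per_m2_per_s):
    """molecules m^-2 s^-1 -> nmol m^-2 h^-1."""
    return molecules_per_m2_per_s / N_A * 3600 * 1e9

flux = flux_nmol_per_m2_per_hour(1e11)
print(round(flux, 2))  # ~0.6 nmol m^-2 h^-1
```

A sub-nanomole hourly exchange per square meter underscores how sensitive the flux-gradient measurements needed to constrain such a model must be.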
Abstract:
Many schools do not begin to introduce college students to software engineering until they have had at least one semester of programming. Since software engineering is a large, complex, and abstract subject, it is difficult to construct active learning exercises that build on the students' elementary knowledge of programming and still teach basic software engineering principles. It is also the case that beginning students typically know how to construct small programs but have little experience with the techniques necessary to produce reliable, long-term maintainable modules. I have addressed these two concerns by defining a local standard (the Montana Tech Method (MTM) Software Development Standard for Small Modules Template) that step-by-step directs students toward the construction of highly reliable small modules using well-known, best-practice software engineering techniques. A "small module" is here defined as a coherent development task that can be unit tested and carried out by a single software engineer (or a pair) in at most a few weeks. The standard describes the process to be used and also provides a template for the top-level documentation. The sequence of mini-lectures and exercises associated with the use of this (and other) local standards is used throughout the course, which perforce covers more abstract software engineering material using traditional reading and writing assignments. The sequence of mini-lectures and hands-on assignments (many of which are done in small groups) constitutes an instructional module that can be used in any similar software engineering course.
Abstract:
Two librarians at a small STEM academic library have partnered with professors to develop and teach chemistry and writing courses. These librarians have successfully worked with professors to serve as an active presence within the classroom. This article describes the challenges of navigating the typical obstacles librarians face when attempting to integrate information literacy into the curriculum, reflects on the benefits of these collaborations, and touches on strategies for implementing similar programs at other institutions. It outlines two distinct approaches to collaborating with professors on credit-bearing information literacy courses, along with the key steps involved in planning and implementing these courses, including generating institutional buy-in, identifying potential collaborators, negotiating workload and responsibilities with collaborators, and planning to sustain courses beyond a single academic year. Suggestions for overcoming obstacles, supplemented by experience-based recommendations, are discussed.
Abstract:
Much of the knowledge about software systems is implicit, and therefore difficult to recover by purely automated techniques. Architectural layers and the externally visible features of software systems are two examples of information that can be difficult to detect from source code alone, and that would benefit from additional human knowledge. Typical approaches to reasoning about data involve encoding an explicit meta-model and expressing analyses at that level. Due to its informal nature, however, human knowledge can be difficult to characterize up-front and integrate into such a meta-model. We propose a generic, annotation-based approach to capture such knowledge during the reverse engineering process. Annotation types can be iteratively defined, refined and transformed, without requiring a fixed meta-model to be defined in advance. We show how our approach supports reverse engineering by implementing it in a tool called Metanool and by applying it to (i) analyzing architectural layering, (ii) tracking reengineering tasks, (iii) detecting design flaws, and (iv) analyzing features.
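The key idea above is that annotation types need not be fixed in a meta-model up front. A minimal sketch of that flavor of design in Python; this is a generic illustration of the approach, not Metanool's actual API:

```python
# Sketch: attaching typed, iteratively definable annotations to program
# elements without committing to a meta-model in advance. Class and property
# names here are hypothetical.
class Annotation:
    def __init__(self, type_name, **properties):
        self.type_name = type_name
        self.properties = dict(properties)  # free-form, refinable later

class AnnotatedElement:
    def __init__(self, name):
        self.name = name
        self.annotations = []

    def annotate(self, type_name, **properties):
        """New annotation types can be introduced at any time -- no schema change."""
        self.annotations.append(Annotation(type_name, **properties))

    def find(self, type_name):
        return [a for a in self.annotations if a.type_name == type_name]

# A reverse engineer records layering knowledge, then a reengineering task,
# each with its own ad-hoc annotation type.
elem = AnnotatedElement("PaymentService")
elem.annotate("Layer", name="domain")
elem.annotate("ReengineeringTask", description="split persistence out", done=False)
print([a.type_name for a in elem.find("Layer")])  # ['Layer']
```

Because annotations are plain data rather than slots in a fixed meta-model, they can be defined, refined, and transformed iteratively, which is what lets the four applications listed above (layering, task tracking, flaw detection, feature analysis) share one mechanism.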
Abstract:
The central question of this paper is how to improve the production process by closing the gap between industrial designers and software engineers of television (TV)-based user interfaces (UIs) in an industrial environment. Software engineers are highly interested in whether one UI design can be converted into several fully functional UIs for TV products with different screen properties. The aim of the software engineers is to apply automatic layout and scaling in order to speed up and improve the production process. However, the question is whether a UI design lends itself to such automatic layout and scaling. This is investigated by analysing a prototype UI design produced by industrial designers. In a first requirements study, industrial designers created meta-annotations on top of their UI design in order to disclose their design rationale for discussions with software engineers. In a second study, five (out of ten) industrial designers assessed the potential of four different meta-annotation approaches. The question was which annotation method industrial designers would prefer and whether it could satisfy the technical requirements of the software engineering process. One main result is that the industrial designers preferred the method they were already familiar with, which therefore seems to be the most effective one, although the main objective of automatic layout and scaling could still not be achieved.
Abstract:
The use of virtual learning environments in Higher Education (HE) has been growing in Portugal, driven by the Bologna Process. An example is the use of Learning Management Systems (LMS), which represent an opportunity to leverage technological advances in the educational process. The progress of information and communication technologies (ICT), coupled with the great development of the Internet, has brought significant challenges to educators that require thorough knowledge of the implementation process. These field notes present the results of a survey among teachers of a private HE institution on their use of Moodle as a tool to support face-to-face teaching. An essentially exploratory research methodology based on a questionnaire survey, supported by statistical analysis, made it possible to identify the motivations, types of use, and perceptions of teachers in relation to this kind of tool. The results showed that most teachers, by a narrow margin (58%), had not changed their pedagogical practice as a consequence of using Moodle. Among those who did, 67% had attended institutional internal training. Some of the results obtained suggest further investigation and provide guidelines for planning future internal training.
Abstract:
How can the quality of learning, teaching, and assessment be increased through the use of new media? Applied to the components and building blocks of the e-education process, this means: - With which digital materials and components is efficient computer-supported content development possible? - With which organizational form of teaching can a maximum quality gain for traditional face-to-face teaching be achieved? - How can traditional forms of examination be enriched by digital media and evaluated with technical assistance? - How must digital content be designed to provide added value for the teaching and learning process, possibly in self-study scenarios? - How must a learning platform be structured to support e-education in its full breadth and achieve high acceptance? The authors are the principal members of the Marburg "Linguistik Engineering Team", which unites all the know-how needed for developing and using various teaching and learning scenarios: from conception through programming to use in every conceivable variant. Their book is a guide showing how a complex e-education system can achieve not only quality gains but also capacity gains and considerable reductions in effort.
Abstract:
Ontologies and Methods for Interoperability of Engineering Analysis Models (EAMs) in an e-Design Environment. September 2007. Neelima Kanuri, B.S., Birla Institute of Technology and Sciences, Pilani, India; M.S., University of Massachusetts Amherst. Directed by: Professor Ian Grosse.
Interoperability is the ability of two or more systems to exchange and reuse information efficiently. This thesis presents new techniques for interoperating engineering tools using ontologies as the basis for representing, visualizing, reasoning about, and securely exchanging abstract engineering knowledge between software systems. The specific engineering domain that is the primary focus of this report is the modeling knowledge associated with the development of engineering analysis models (EAMs). This abstract modeling knowledge has been used to support integration of analysis and optimization tools in iSIGHT-FD, a commercial engineering environment. ANSYS, a commercial FEA tool, has been wrapped as an analysis service available inside of iSIGHT-FD. An engineering analysis modeling (EAM) ontology has been developed and instantiated to form a knowledge base for representing analysis modeling knowledge. The instances of the knowledge base are the analysis models of real-world applications. To illustrate how abstract modeling knowledge can be exploited for useful purposes, a cantilever I-beam design optimization problem has been used as a test-bed proof-of-concept application. Two distinct finite element models of the I-beam are available to analyze a given beam design: a beam-element finite element model with potentially lower accuracy but significantly reduced computational cost, and a high-fidelity, high-cost, shell-element finite element model. The goal is to obtain an optimized I-beam design at minimum computational expense. An intelligent knowledge-base tool was developed and implemented in FiPER.
This tool reasons about the modeling knowledge to intelligently shift between the beam and shell element models during an optimization process, selecting the best analysis model for a given optimization design state. In addition to improved interoperability and design optimization, methods are developed and presented that demonstrate the ability to operate on ontological knowledge bases to perform important engineering tasks. One such method is an automatic technical report generation method, which converts the modeling knowledge associated with an analysis model into a flat technical report. The second is a secure knowledge sharing method, which allocates permissions to portions of knowledge to control knowledge access and sharing. Acting together, the two methods enable recipient-specific, fine-grained controlled knowledge viewing and sharing in an engineering workflow integration environment such as iSIGHT-FD. These methods play an important role in reducing the large-scale inefficiencies in current product design and development cycles caused by poor knowledge sharing and reuse between people and software engineering tools. This work is a significant advance in both the understanding and the application of knowledge integration in a distributed engineering design framework.
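The beam-versus-shell switching described above amounts to a knowledge-encoded selection rule: use the cheap model where its accuracy suffices, pay for fidelity near the constraint boundary. A minimal sketch of such a rule; the threshold values and function names are hypothetical, not taken from the KB tool:

```python
# Sketch: the kind of model-selection rule an intelligent KB tool might encode
# during optimization. Thresholds (0.8 stress ratio, 5% error tolerance) are
# illustrative assumptions.
def select_analysis_model(stress_ratio, beam_error_estimate, tol=0.05):
    """stress_ratio = predicted stress / allowable stress for the current design."""
    # Far from the constraint boundary, the coarse beam model is good enough.
    if stress_ratio < 0.8 and beam_error_estimate < tol:
        return "beam-element"
    # Near the boundary, or when the beam model is unreliable, use high fidelity.
    return "shell-element"

print(select_analysis_model(0.5, 0.02))   # beam-element
print(select_analysis_model(0.95, 0.02))  # shell-element
```

Since the shell model may cost orders of magnitude more per evaluation, running most optimization iterations on the beam model and reserving the shell model for near-optimal candidates is what yields the "minimum computational expense" goal stated above.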
Abstract:
This work presents the proceedings of the twelfth symposium, held at Kansas State University on April 24, 1982. Since a number of the contributions will be published in detail elsewhere, only brief reports are included here. Some of the reports describe current progress on ongoing projects. Requests for further information should be directed to Dr. Peter Reilly at Iowa State University, Dr. V. G. Murphy at Colorado State University, Dr. Rakesh Bajpai at University of Missouri, Dr. Ed Clausen at University of Arkansas, or Dr. L. T. Fan and Dr. L. E. Erickson at Kansas State University.
Contents:
- A Kinetic Analysis of Oleaginous Yeast Fermentation by Candida curvata on Whey Permeate, B.D. Brown and K.H. Hsu, Iowa State University
- Kinetics of Biofouling in Simulated Water Distribution Systems Using CSTR, T.M. Prakash, University of Missouri
- Kinetics of Gas Production by C. acetobutylicum, Michael Doremus, Colorado State University
- Large Scale Production of Methane from Agricultural Residues, O.P. Doyle, G.C. Magruder, E.C. Clausen, and J.L. Gaddy, University of Arkansas
- The Optimal Process Design for Enzymatic Hydrolysis of Wheat Straw, M.M. Gharpuray and L.T. Fan, Kansas State University
- Extractive Butanol Fermentation, Michael Sierks, Colorado State University
- Yields Associated with Ethyl Alcohol Production, M.D. Oner, Kansas State University
- Estimation of Growth Yield and Maintenance Parameters for Microbial Growth on Corn Dust, B.O. Solomon, Kansas State University
- Milling of Ensiled Corn, Andrzej Neryng, Iowa State University
- Protein Extraction from Alfalfa, Ravidranath Joshi, Colorado State University
- Analysis of Disaccharides by Capillary Gas Chromatography, Z.L. Nikolov, Iowa State University
- Characterization of High Viscosity Fermentations in Tower Fermentors, S.A. Patel and C.H. Lee, Kansas State University
- Utilization of Sugars in Sorghum Molasses by Clostridium acetobutylicum, B. Hong, K.C. Shin, and L.T. Fan, Kansas State University