1000 results for Michigan Tech


Relevance: 60.00%

Abstract:

The purpose of this research is to examine the role of the mining company office in the management of the copper industry in Michigan's Keweenaw Peninsula between 1901 and 1946. Two of the largest and most influential companies were examined: the Calumet & Hecla Mining Company and the Quincy Mining Company. Both companies operated for more than forty years under general managers who were arguably the most influential people in the management of each company. James MacNaughton, general manager at Calumet and Hecla, served from 1901 through 1941; Charles Lawton, general manager at the Quincy Mining Company, served from 1905 through 1946. Both managers were college-educated engineers and adopted scientific management techniques to operate their respective companies. This research focused on two main goals. The first was to address the managerial changes in Michigan's copper mining offices of the early twentieth century, including the work of MacNaughton and Lawton, along with analysis of the office structures themselves and how they changed over time. The second was to create a prototype virtual exhibit for use at the Quincy Mining Company office, allowing visitors to experience the office as an office worker would have in the early twentieth century. To meet both goals, this project used various research materials, including archival sources, oral histories, and material culture, to recreate the history of mining company management in the Copper Country.

Relevance: 60.00%

Abstract:

Estimating unmeasurable states is an important component of onboard diagnostics (OBD) and control strategy development in diesel exhaust aftertreatment systems. This research focuses on the development of an Extended Kalman Filter (EKF) based state estimator for two of the main components in a diesel engine aftertreatment system: the Diesel Oxidation Catalyst (DOC) and the Selective Catalytic Reduction (SCR) catalyst. One of the key areas of interest is the performance of these estimators when the catalyzed particulate filter (CPF) is being actively regenerated. In this study, model reduction techniques were developed and used to derive reduced order models from the 1D models used to simulate the DOC and SCR. As a result of order reduction, the number of states in the estimator is reduced from 12 to 1 per element for the DOC and from 12 to 2 per element for the SCR. The reduced order models were simulated on the experimental data and compared to the high fidelity model and the experimental data. The results show that the effect of eliminating the heat transfer and mass transfer coefficients is not significant for the performance of the reduced order models, as indicated by an insignificant change in the kinetic parameters between the reduced order and 1D models when simulating the experimental data. An EKF based estimator was then developed to estimate the internal states of the DOC and SCR. The DOC and SCR estimators were simulated on the experimental data to show that the estimator provides improved estimation of states compared to a reduced order model alone. The results showed that using the temperature measurement at the DOC outlet improved the estimates of the CO, NO, NO2, and HC concentrations from the DOC. The SCR estimator was used to evaluate the effect of NH3 and NOx sensors on state estimation quality. Three sensor combinations were evaluated: a NOx sensor only, an NH3 sensor only, and both NOx and NH3 sensors. The NOx-only configuration performed worst, the NH3-only configuration was intermediate, and the combination of both NOx and NH3 sensors provided the best performance.
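
A minimal sketch of the EKF recursion that such an estimator is built around is given below in Python. The model functions f and h, their Jacobians, and the noise covariances Q and R are placeholders for illustration; the actual DOC and SCR models from this work are not reproduced here.

    import numpy as np

    def ekf_step(x, P, u, z, f, h, F_jac, H_jac, Q, R):
        # Predict: propagate the state estimate and covariance through the model.
        x_pred = f(x, u)                      # nonlinear state update (placeholder model)
        F = F_jac(x, u)                       # Jacobian of f at (x, u)
        P_pred = F @ P @ F.T + Q
        # Update: correct the prediction with the sensor measurement z.
        H = H_jac(x_pred)                     # Jacobian of h at x_pred
        S = H @ P_pred @ H.T + R              # innovation covariance
        K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
        x_new = x_pred + K @ (z - h(x_pred))
        P_new = (np.eye(len(x)) - K @ H) @ P_pred
        return x_new, P_new

Evaluating different sensor combinations, as done for the SCR, amounts to changing the measurement function h and noise covariance R while keeping the same predict/update loop.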

Relevance: 60.00%

Abstract:

Climate science and climate change are included in the Next Generation Science Standards, curriculum standards released in 2013. Incorporating these topics, especially climate change, has been a difficult task for teachers. A team of scientists is studying aerosols in the free troposphere: what their properties are, how they change while in the atmosphere, and where they come from. Lessons were created based on this real, ongoing scientific research being conducted in the Azores. During these activities, students see what scientists actually do through videos and participate in similar tasks, such as conducting experiments, collecting data, and analyzing data. At the conclusion of the lessons, students form conclusions based on the evidence they have at the time.

Relevance: 60.00%

Abstract:

Current copper-based circuit technology is becoming a limiting factor in high speed data transfer applications, as processors are improving at a faster rate than are developments to increase on-board data transfer. One solution is to utilize optical waveguide technology to overcome these bandwidth and loss restrictions. The use of this technology virtually eliminates the heat and cross-talk loss seen in copper circuitry, while also operating at a higher bandwidth. Transitioning current fabrication techniques from small scale laboratory environments to large scale manufacturing presents significant challenges. Optical-to-electrical connections and out-of-plane coupling are significant hurdles in the advancement of optical interconnects. The main goals of this research are the development of direct write material deposition and patterning tools for the fabrication of waveguide systems on large substrates, and the development of out-of-plane coupler components compatible with standard fiber optic cabling. Combining these elements with standard printed circuit boards allows for the fabrication of fully functional optical-electrical printed wiring boards (OEPWBs). A direct dispense tool was designed, assembled, and characterized for the repeatable dispensing of blanket waveguide layers over a range of thicknesses (25-225 µm), eliminating waste material and affording the ability to utilize large substrates. This tool was used to directly dispense multimode waveguide cores, which required no UV definition or development. These cores had circular cross sections and were comparable in optical performance to lithographically fabricated square waveguides. Laser direct writing is a non-contact process that allows for the dynamic UV patterning of waveguide material on large substrates, eliminating the need for high resolution masks. A laser direct write tool was designed, assembled, and characterized for patterning waveguides that were comparable in quality to those produced using standard lithographic practices (0.047 dB/cm loss for laser written waveguides compared to 0.043 dB/cm for lithographic waveguides). Straight waveguides and waveguide turns were patterned at multimode and single mode sizes, and the process was characterized and documented. Support structures such as angled reflectors and vertical posts were produced, showing the versatility of the laser direct write tool. Commercially available components were embedded in the optical layer for out-of-plane routing of the optical signals. These devices featured spherical lenses on the input and output sides of a total internal reflection (TIR) mirror, as well as alignment pins compatible with the standard MT design. Fully functional OEPWBs were fabricated featuring input and output out-of-plane optical signal routing, with total optical losses not exceeding 10 dB. These prototypes survived thermal cycling (-40°C to 85°C) and humidity exposure (95±4% humidity) with minimal degradation in optical performance; operational failure occurred only after environmental aging life testing at 110°C for 216 hours.

Relevance: 60.00%

Abstract:

This dissertation addresses the sustainability of the rapid provision of safe water and sanitation required to meet the Millennium Development Goals (MDGs). A review of health-related literature and global statistics demonstrates engineers' role in achieving the MDGs. This review is followed by analyses relating to the social, environmental, and health aspects of meeting MDG targets. Analysis of national indicators showed that inadequate investment and poor or nonexistent policies and governance are challenges to global sanitation coverage, in addition to a lack of financial resources and gender disparity. Although water availability was not found to be a challenge globally, geospatial analysis demonstrated that it is a potentially significant barrier for up to 46 million people living in urban areas and relying on already degraded water resources for environmental income. A daily water balance model incorporating the Natural Resources Conservation Service curve number method in Bolivian watersheds showed that local water stress is linked to climate change through reduced recharge. Agricultural expansion in the region slightly exacerbates the recharge reductions. Although runoff changes will range from -17% to 14%, recharge rates will decrease under all climate scenarios evaluated (-14% to -27%). Increasing sewer coverage may place stress on the readily accessible natural springs, but increased demand can be sustained if other sources of water supply are developed. This analysis provides a method for hydrological analysis in data-scarce regions: the data required for the model were either obtained from publicly available data products or collected through field work using low-cost methods feasible for local participants. Lastly, a methodology was developed to evaluate the public health impacts of increased household water access resulting from domestic rainwater harvesting, incorporating knowledge of the water requirements of sanitation and hygiene technologies. In 37 West African cities, domestic rainwater harvesting has the potential to reduce the diarrheal disease burden by 9% if implemented alone with 400 L of storage; if implemented in conjunction with point-of-use treatment, this reduction could increase to 16%. The methodology will contribute to cost-effectiveness evaluations of interventions as well as evaluations of the potential disease burden resulting from reduced water supply, such as the reductions observed in the Bolivian communities.
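
The runoff component of such a daily water balance can be stated compactly. The sketch below implements the standard NRCS curve number runoff equation in Python; the curve number and the 0.2 initial abstraction ratio are illustrative assumptions, not values taken from this dissertation.

    import numpy as np

    def scs_runoff_mm(precip_mm, cn, ia_ratio=0.2):
        # Potential maximum retention S (mm) from the curve number CN.
        s = 25400.0 / cn - 254.0
        ia = ia_ratio * s  # initial abstraction before any runoff occurs
        p = np.asarray(precip_mm, dtype=float)
        # Runoff Q = (P - Ia)^2 / (P - Ia + S) once P exceeds Ia, else zero.
        return np.where(p > ia, (p - ia) ** 2 / (p - ia + s), 0.0)

    # Example: a 40 mm storm on a soil/land-cover combination with CN = 75
    print(scs_runoff_mm(40.0, 75))  # ~4.9 mm of runoff; the rest is retained

In a daily water balance, recharge is then estimated from the precipitation that neither runs off nor is lost to evapotranspiration, which is why reduced precipitation under the climate scenarios translates directly into reduced recharge.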

Relevance: 60.00%

Abstract:

As agricultural non-point source pollution (ANPSP) has become the most significant threat to the water environment and a driver of lake eutrophication in China, more and more scientists and technologists are focusing on control countermeasures and the pollution mechanisms of agricultural non-point source pollution. An unreasonable rural production structure and limited scientific management measures are the main reasons for the acute ANPSP problems in China. At present, the main problem for pollution control is a lack of specific regulations, which limits the government's management efficiency. In light of these characteristics and problems, this paper puts forward corresponding policies. The status of agricultural non-point source pollution in China is analyzed, and an ANPSP prevention and control model is provided, based on governance policy, environmental legislation, a technical system, and subsidy policy. Finally, a case analysis of Qiandao Lake is given, and an economic policy is proposed based on its situation.

Relevance: 60.00%

Abstract:

Virtually every sector of business and industry that uses computing, including financial analysis, search engines, and electronic commerce, incorporates Big Data analysis into its business model. Sophisticated clustering algorithms are popular for deducing the nature of data by assigning labels to unlabeled data. We address two main challenges in Big Data. First, by definition, the volume of Big Data is too large to be loaded into a computer's memory (this volume varies with the computer used or available, but there is always a data set that is too large for any computer). Second, in real-time applications, the velocity of new incoming data prevents historical data from being stored and future data from being accessed. Therefore, we propose our Streaming Kernel Fuzzy c-Means (stKFCM) algorithm, which significantly reduces both computational complexity and space complexity. The proposed stKFCM requires only O(n²) memory, where n is the (predetermined) size of a data subset (or data chunk) at each time step, which makes this algorithm truly scalable (as n can be chosen based on the available memory). Furthermore, only 2n² elements of the full N × N kernel matrix (where N >> n) need to be calculated at each time step, reducing both the time to produce the kernel elements and the complexity of the FCM algorithm. Empirical results show that stKFCM, even with relatively small n, can provide clustering performance as accurate as that of kernel fuzzy c-means run on the entire data set, while achieving a significant speedup.
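
As a sketch of the kernel fuzzy c-means core that stKFCM builds on, the Python code below runs the standard kernel FCM membership update on a precomputed n × n kernel matrix for a single data chunk. The streaming machinery of stKFCM (carrying cluster information between chunks) is not reproduced here, and the kernel width gamma, fuzzifier m, and iteration count are illustrative choices.

    import numpy as np

    def rbf_kernel(X, gamma=1.0):
        # Gaussian (RBF) kernel matrix for one data chunk X of shape (n, d).
        d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
        return np.exp(-gamma * d2)

    def kernel_fcm(K, c, m=2.0, n_iter=50, seed=0):
        # Kernel fuzzy c-means on an n x n kernel matrix K.
        # Returns the c x n membership matrix U (columns sum to 1).
        rng = np.random.default_rng(seed)
        n = K.shape[0]
        U = rng.random((c, n))
        U /= U.sum(axis=0, keepdims=True)
        for _ in range(n_iter):
            W = U ** m
            W /= W.sum(axis=1, keepdims=True)  # weights defining implicit centers
            # Squared feature-space distance from point i to implicit center j:
            # d2[j, i] = K[i, i] - 2 (W[j] @ K)[i] + W[j] @ K @ W[j]
            d2 = (np.diag(K)[None, :]
                  - 2.0 * (W @ K)
                  + np.einsum('jk,kl,jl->j', W, K, W)[:, None])
            d2 = np.maximum(d2, 1e-12)
            U = d2 ** (-1.0 / (m - 1.0))
            U /= U.sum(axis=0, keepdims=True)
        return U

    # Example on one chunk: two well-separated blobs
    X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
    U = kernel_fcm(rbf_kernel(X, gamma=0.5), c=2)

The O(n²) memory claim corresponds to working with per-chunk kernel matrices of this size rather than the full N × N matrix.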

Relevance: 60.00%

Abstract:

Laser speckle contrast imaging (LSCI) has the potential to be a powerful tool in medicine, but more research in the field is required before it can be used properly. To help advance Michigan Tech's research in the field, a graphical user interface (GUI) was designed in Matlab to control the instrumentation of the experiments and to process the raw speckle images into contrast images while they are being acquired. The design of the system was successful, and it is currently being used by Michigan Tech's Biomedical Engineering department. This thesis describes the development of the LSCI GUI and offers a full introduction to the history, theory, and applications of LSCI.
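
The contrast computation applied to each raw frame in LSCI is the standard spatial speckle contrast, K = sigma/mean over a small sliding window. A minimal Python equivalent is sketched below (the thesis GUI itself is written in Matlab); the 7 × 7 window size is an illustrative choice.

    import numpy as np
    from scipy.ndimage import uniform_filter

    def speckle_contrast(raw, window=7):
        # Spatial speckle contrast: local standard deviation / local mean.
        raw = raw.astype(float)
        mean = uniform_filter(raw, window)
        mean_sq = uniform_filter(raw ** 2, window)
        var = np.maximum(mean_sq - mean ** 2, 0.0)  # guard against round-off
        return np.sqrt(var) / np.maximum(mean, 1e-12)

Lower contrast indicates more blurring of the speckle pattern within the exposure time, i.e. faster motion such as blood flow, which is what makes the contrast image medically useful.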

Relevance: 60.00%

Abstract:

The federally endangered Karner blue butterfly (Lycaeides melissa samuelis Nabokov) persists in rare oak/pine grassland communities spanning the Great Lakes region, relying on its host plant, wild blue lupine (Lupinus perennis). Conservation efforts since 1992 have led to the development of several programs that restore and monitor habitat. This study aims to evaluate Karner blue habitat selection in the state of Wisconsin and to develop high-resolution tools for use in conservation efforts. Spatial predictive models developed during this study accurately predicted potential habitat across state properties based on soils and canopy cover, and identified ~51-100% of Karner blue occurrences based on lupine and shrub/tree cover and focal nectar plant abundance. When evaluated relative to American bison (Bison bison), Karner blues and lupine were more likely to occur in areas of low disturbance but aggregated where bison had recently been present in areas of moderate/high disturbance. The lupine C:N ratio increased with the cover of shrubs/trees and focal nectar plant abundance and decreased with the cover of ground litter. Karner blue density increased with the lupine C:N ratio, decreased with nitrogen content, and was not related to phenolic levels. We strongly suggest that areas of different soil textures must be managed differently and that maintenance techniques should generate a mix of shrub/tree cover (10-45%) and ground litter cover (~10-40%), maintain >5% cover of lupine, and establish an abundance of focal nectar plants. This study provides unique tools for use in conservation and should aid in focusing management efforts and the recovery of this species.

Relevance: 60.00%

Abstract:

How can we calculate earthquake magnitudes when the signal is clipped and over-run? When a volcano is very active, the seismic record may saturate (i.e., the full amplitude of the signal is not recorded) or be over-run (i.e., the end of one event is covered by the start of a new event). The duration, and sometimes the amplitude, of an earthquake signal are necessary for determining event magnitudes; thus, it may be impossible to calculate earthquake magnitudes when a volcano is very active. This problem is most likely to occur at volcanoes with limited networks of short-period seismometers. This study outlines two methods for calculating earthquake magnitudes when events are clipped and over-run. The first method entails modeling the shape of earthquake codas as a power law function and extrapolating duration from the decay of the function. The second method draws relations between the clipped duration (i.e., the length of time a signal is clipped) and the full duration. These methods allow magnitudes to be determined within 0.2 to 0.4 magnitude units. This error is within the range of analyst hand-picks and within the acceptable limits of uncertainty when quickly quantifying volcanic energy release during volcanic crises. Most importantly, these estimates can be made when data are clipped or over-run. These methods were developed with data from the initial stages of the 2004-2008 eruption of Mount St. Helens, a well-studied volcano with many instruments placed at varying distances from the vent, which makes that eruption a good place to calibrate and refine methodologies that can be applied to volcanoes with limited networks.
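
As an illustration of the first method, the Python sketch below fits a power-law decay A(t) = A0 * t^(-p) to the unclipped portion of a coda and extrapolates to the time at which the amplitude falls to the noise floor. The sample values and the noise-floor threshold are hypothetical, not data from this study.

    import numpy as np

    def extrapolate_duration(t, amp, noise_floor):
        # Least-squares fit of log A = log A0 - p * log t to the coda tail.
        slope, log_a0 = np.polyfit(np.log(t), np.log(amp), 1)
        p, a0 = -slope, np.exp(log_a0)
        # Solve a0 * t**(-p) = noise_floor for the full duration t.
        return (a0 / noise_floor) ** (1.0 / p)

    # Hypothetical coda samples: seconds after onset vs. amplitude (counts)
    t = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
    amp = 5000.0 * t ** -1.5
    print(extrapolate_duration(t, amp, noise_floor=20.0))  # ~39.7 s

The extrapolated duration can then be used in a standard duration-magnitude relation even though the record itself never returns cleanly to background noise.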