7 results for Effective method
in Digital Commons - Michigan Tech
Abstract:
Reducing noise and vibration has long been a goal in major industries: automotive, aerospace, and marine, to name a few. Products must be tested and pass federally regulated standards before entering the market. Vibration measurements are commonly acquired using accelerometers; however, the limitations of this method create a need for alternative solutions. Two methods for non-contact vibration measurement are compared: Laser Vibrometry, which directly measures the surface velocity of an aluminum plate, and Nearfield Acoustic Holography (NAH), which measures sound pressure in the nearfield and, using Green's functions, reconstructs the surface velocity at the plate. The surface velocity from each method is then used in modal analysis to determine the comparability of frequency, damping, and mode shapes. Frequencies and mode shapes are also compared to an FEA model. Laser Vibrometry is a proven, direct method for determining surface velocity and subsequently calculating modal analysis results. NAH is an effective method for locating noise sources, especially those that are not well separated spatially, yet little work has been done on incorporating NAH into modal analysis.
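As a concrete illustration of the NAH step described above, the sketch below back-propagates hologram-plane pressure to plate-surface normal velocity using the planar (Fourier) form of the Green's function propagator. It is a minimal sketch, not the thesis's processing code: the regular measurement grid, e^{jωt} convention, single analysis frequency, and simple k-space cutoff are all assumptions.

```python
# Minimal planar NAH sketch: back-propagate measured nearfield pressure to the
# plate surface and convert to normal velocity (assumed geometry and conventions;
# a practical implementation would add regularization such as a Tikhonov filter).
import numpy as np

def nah_surface_velocity(p_holo, dx, freq, standoff, rho=1.21, c=343.0, k_cut=None):
    """p_holo: complex pressure on a regular grid at 'standoff' metres above the plate."""
    ny, nx = p_holo.shape
    k = 2 * np.pi * freq / c                                  # acoustic wavenumber
    KX, KY = np.meshgrid(2 * np.pi * np.fft.fftfreq(nx, dx),
                         2 * np.pi * np.fft.fftfreq(ny, dx))
    kr2 = KX**2 + KY**2
    kz = np.where(kr2 <= k**2,                                # propagating components
                  np.sqrt(np.maximum(k**2 - kr2, 0.0)),
                  -1j * np.sqrt(np.maximum(kr2 - k**2, 0.0))) # evanescent components

    P = np.fft.fft2(p_holo)                                   # angular spectrum
    P_src = P * np.exp(1j * kz * standoff)                    # back-propagate to the plate
    if k_cut is not None:                                     # crude cutoff to limit the
        P_src[np.sqrt(kr2) > k_cut] = 0.0                     # evanescent-wave blow-up
    V_src = P_src * kz / (rho * c * k)                        # Euler's equation in k-space
    return np.fft.ifft2(V_src)                                # complex surface normal velocity
```

The velocity field recovered at each analysis frequency can then be fed to the same modal parameter estimation used for the laser vibrometer data.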
Abstract:
Dolomite [CaMg(CO3)2] is an intolerable impurity in phosphate ores due to its MgO content. Traditionally, the Florida phosphate industry has avoided mining high-MgO phosphate reserves due to the lack of an economically viable process for removing dolomite. However, as the high-grade phosphate reserves become depleted, more emphasis is being put on developing a cost-effective method for separating dolomite from high-MgO phosphate ores. In general, the phosphate industry demands a phosphate concentrate containing less than 1% MgO. Dolomite impurities have mineralogical properties that are very similar to those of the desired phosphate mineral (francolite), making separation of the two minerals very difficult. Magnesium is primarily found as distinct dolomite-rich pebbles, as very fine dolomite inclusions in predominantly francolite pebbles, and as magnesium substituted into the francolite structure. Jigging is a gravity separation process that attempts to take advantage of the density difference between the dolomite and francolite pebbles. A unique laboratory-scale jig was designed and built at Michigan Tech for this study. Through a series of tests it was found that a pulsation rate of 200 pulses/minute, a stroke length of 1 inch, a water addition rate of 0.5 gpm, and alumina ragging balls were optimum for this study. To investigate the feasibility of jigging for removing dolomite from phosphate ore, two high-MgO phosphate ores were tested using the optimized jigging parameters: (1) Plant #1 was sized to 4.00 x 0.85 mm and contained 1.55% MgO; (2) Plant #2 was sized to 3.40 x 0.85 mm and contained 3.07% MgO. A sample from each plant was visually separated by hand into dolomite-rich and francolite-rich fractions, which were then analyzed to determine the minimum achievable MgO levels. For the Plant #1 phosphate ore, a concentrate containing 0.89% MgO was achieved at a recovery of 32.0% BPL. For Plant #2, a phosphate concentrate containing 1.38% MgO was achieved at a recovery of 74.7% BPL. Minimum achievable MgO levels were determined to be 0.53% MgO for Plant #1 and 1.15% MgO for Plant #2.
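Grade and recovery figures like those above follow from a standard two-product balance; the sketch below shows that calculation with hypothetical assays (the numbers are illustrative, not the thesis data).

```python
# Two-product balance behind grade/recovery figures such as "0.89% MgO at
# 32.0% BPL recovery" (assays below are hypothetical, not from the jig tests).

def two_product(feed_assay, conc_assay, tail_assay):
    """Return concentrate mass yield (%) and recovery (%) from feed, concentrate,
    and tailings assays of the same component (here, BPL)."""
    mass_yield = 100.0 * (feed_assay - tail_assay) / (conc_assay - tail_assay)
    recovery = mass_yield * conc_assay / feed_assay
    return mass_yield, recovery

yield_pct, bpl_recovery = two_product(feed_assay=55.0, conc_assay=62.0, tail_assay=40.0)
print(f"mass yield {yield_pct:.1f}%, BPL recovery {bpl_recovery:.1f}%")
```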
Abstract:
Viral infections account for over 13 million deaths per year. Antiviral drugs and vaccines are the most effective means of treating viral diseases. Antiviral compounds have revolutionized the treatment of AIDS and reduced its mortality rate; however, the disease still causes a large number of deaths in developing countries that lack access to these drugs. Vaccination is the most effective method of preventing viral disease; vaccines prevent around 2.5 million deaths per year. Vaccines are not able to offer full coverage because of the high operational costs of the manufacturing processes. Although vaccines have saved millions of lives, conventional vaccines often cause reactogenic side effects. New technologies have been created to eliminate these undesired side effects; however, the newer vaccines are less immunogenic, so adjuvants such as vaccine delivery vehicles are required. This work focuses on the discovery of new natural antivirals that can reduce the high cost and side effects of synthetic drugs. We discovered that two osmolytes, trimethylamine N-oxide (TMAO) and glycine, reduce the infectivity of a model virus, porcine parvovirus (PPV), by 4 LRV (99.99%), likely by disrupting capsid assembly. These osmolytes have the potential to be used as drugs, since they showed antiviral activity after 20 h. We have also focused on improving current vaccine manufacturing processes so that fast, effective, and economical vaccines can be produced worldwide. We propose virus flocculation in osmolytes followed by microfiltration as an economical alternative for vaccine manufacturing. Osmolytes are able to specifically flocculate hydrophobic virus particles by depleting the hydration layer around the particles, which subsequently causes virus aggregation. The osmolyte mannitol was able to flocculate virus particles and demonstrated high virus removal: 81% for PPV and 98.1% for Sindbis virus (SVHR). Virus flocculation with mannitol, followed by microfiltration, could be used as a platform process for virus purification. Finally, we performed biocompatibility studies on soft-templated mesoporous carbon materials with the aim of using these materials as vaccine delivery vehicles. We discovered that these materials are biocompatible and that their degree of biocompatibility is within the range of other biomaterials currently employed in biomedical applications.
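The "4 LRV (99.99%)" figure above is a log10 reduction value; the short sketch below makes the conversion explicit (the titer values are placeholders).

```python
# Log reduction value (LRV) and its percent-reduction equivalent, as quoted for
# the TMAO/glycine results above (titer values are placeholders).
import math

def lrv(initial_titer, final_titer):
    return math.log10(initial_titer / final_titer)

def percent_reduction(lrv_value):
    return 100.0 * (1.0 - 10.0 ** (-lrv_value))

print(lrv(1e7, 1e3))           # 4.0 log reduction
print(percent_reduction(4.0))  # 99.99 percent of infectivity removed
```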
Abstract:
Compiler optimizations help code run faster at runtime. When compilation is done before the program is run, compilation time is less of an issue, but how do on-the-fly compilation and optimization impact the overall runtime? If the compiler must compete with the running application for resources, the running application will take more time to complete. This paper investigates the impact of specific compiler optimizations on the overall runtime of an application. A foldover Plackett–Burman design is used to choose the compiler optimizations that appear to contribute to shorter overall runtimes. These selected optimizations are compared with the default optimization levels in the Jikes RVM. The method selects optimizations that result in a shorter overall runtime than the default O0, O1, and O2 levels, showing that careful selection of compiler optimizations can have a significant, positive impact on overall runtime.
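The sketch below illustrates the kind of foldover Plackett–Burman screening described above: estimate the main effect of each on/off optimization on overall runtime and rank the candidates. It is a generic illustration with synthetic runtimes, not the Jikes RVM experiment itself.

```python
# Foldover Plackett-Burman screening of 7 hypothetical on/off compiler flags.
# The design is built from a Hadamard matrix; folding it over (mirroring the
# signs) frees the main effects from two-factor-interaction aliasing.
import numpy as np
from scipy.linalg import hadamard

base = hadamard(8)[:, 1:]              # 8 runs x 7 factors (drop the all-ones column)
design = np.vstack([base, -base])      # foldover -> 16 runs

# Synthetic overall runtimes, one per design row, purely to make the sketch run;
# in practice each value is a measured benchmark runtime under that flag setting.
rng = np.random.default_rng(0)
runtimes = 100 + design @ rng.normal(0, 2, size=7) + rng.normal(0, 0.5, size=16)

# Main effect of each flag = mean runtime with it on minus mean runtime with it off.
effects = design.T @ runtimes * 2 / len(runtimes)
print(np.argsort(effects))             # most negative effect = largest runtime reduction
```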
Abstract:
Determining how an exhaust system will perform acoustically before a prototype muffler is built can save the designer a substantial amount of time and resources. To use the available simulation tools effectively, it is important to understand which tool is most effective for the intended analysis, as well as how typical elements in an exhaust system affect muffler performance. An in-depth look at the available tools and their most beneficial uses is presented in this thesis. A full parametric study of typical muffler elements was conducted using the finite element method (FEM) and correlated with experimental results. This thesis lays out the groundwork for accurately predicting free-field sound pressure levels for an exhaust system with the engine properties included. The accuracy of the model depends heavily on the correct temperature profile of the model as well as on the accuracy of the source properties; these factors are discussed in detail, and methods for determining them are presented. The secondary effects of mean flow, which affects both acoustical wave propagation and flow noise generation, are also discussed, and effective ways of predicting these secondary effects are described. Experimental models are tested on a flow rig that showcases these phenomena.
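As an example of the kind of prediction discussed above, the classic plane-wave formula for the transmission loss of a simple expansion chamber is often used as a sanity check for FEM muffler models; the sketch below evaluates it for illustrative dimensions and ignores mean flow, temperature gradients, and higher-order modes.

```python
# Plane-wave transmission loss of a simple expansion chamber, a common benchmark
# for FEM muffler predictions (dimensions are illustrative; the mean-flow and
# temperature effects discussed in the thesis are not included here).
import numpy as np

def expansion_chamber_tl(freq, chamber_len, d_chamber, d_pipe, c=343.0):
    m = (d_chamber / d_pipe) ** 2                      # area expansion ratio
    k = 2 * np.pi * freq / c                           # acoustic wavenumber
    return 10 * np.log10(1 + 0.25 * (m - 1 / m) ** 2 * np.sin(k * chamber_len) ** 2)

freqs = np.linspace(20, 3000, 500)
tl = expansion_chamber_tl(freqs, chamber_len=0.30, d_chamber=0.15, d_pipe=0.05)
# TL peaks where sin(kL) = 1 and falls to 0 dB at the chamber pass frequencies.
```

A hotter exhaust gas raises the speed of sound and shifts the pass frequencies upward, which is one reason the temperature profile matters so much for model accuracy.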
Abstract:
Direct sampling methods are increasingly being used to solve the inverse medium scattering problem of estimating the shape of a scattering object. A simple direct method using one incident wave and multiple measurements was proposed by Ito, Jin, and Zou. In this report, we performed analytic and numerical studies of the direct sampling method. The method was found to be effective in general; however, the investigation exposed a few exceptions. Analytic solutions in different situations were studied to verify the viability of the method, while numerical tests were used to validate its effectiveness.
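For orientation, the sketch below shows a direct-sampling-style indicator of the kind analyzed in the report: the scattered field measured on a curve is correlated with the fundamental solution centered at each sampling point, and large values flag the scatterer. The geometry and variable names are assumptions; this is not the report's code.

```python
# Direct-sampling-style indicator for the 2D inverse medium problem: correlate
# the measured scattered field on the measurement curve with the fundamental
# solution centred at each sampling point (illustrative, single incident wave).
import numpy as np
from scipy.special import hankel1

def dsm_indicator(u_s, receivers, grid_pts, k):
    """u_s: scattered field at the receiver points (complex, shape (N,));
    receivers, grid_pts: arrays of 2D coordinates; k: wavenumber."""
    index = np.empty(len(grid_pts))
    u_norm = np.linalg.norm(u_s)
    for i, z in enumerate(grid_pts):
        r = np.linalg.norm(receivers - z, axis=1)     # |x - z| over the curve
        G = 0.25j * hankel1(0, k * r)                 # 2D fundamental solution
        index[i] = abs(np.vdot(G, u_s)) / (u_norm * np.linalg.norm(G))
    return index                                      # peaks near the scatterer support
```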
Abstract:
Credible spatial information characterizing the structure and site quality of forests is critical to sustainable forest management and planning, especially given the increasing demands on, and threats to, forest products and services. Forest managers and planners are required to evaluate forest conditions over a broad range of scales, contingent on operational or reporting requirements. Traditionally, forest inventory estimates are generated via a design-based approach that involves generalizing sample plot measurements to characterize an unknown population across a larger area of interest. However, field plot measurements are costly and, as a consequence, spatial coverage is limited. Remote sensing technologies have shown remarkable success in augmenting limited sample plot data to generate stand- and landscape-level spatial predictions of forest inventory attributes. Further enhancement of forest inventory approaches that couple field measurements with cutting-edge remotely sensed and geospatial datasets is essential to sustainable forest management. We evaluated a novel Random Forest-based k-Nearest Neighbors (RF-kNN) imputation approach to couple remote sensing and geospatial data with field inventory data collected by different sampling methods and generate forest inventory information across large spatial extents. Forest inventory data collected by the FIA program of the US Forest Service were integrated with optical remote sensing and other geospatial datasets to produce biomass distribution maps for part of the Lake States and species-specific site index maps for the entire Lake States region. Targeting small-area applications of state-of-the-art remote sensing, LiDAR (light detection and ranging) data were integrated with field data collected by an inexpensive method, called variable plot sampling, in the Ford Forest of Michigan Tech to derive a standing volume map in a cost-effective way. The outputs of the RF-kNN imputation were compared with independent validation datasets and with extant map products based on different sampling and modeling strategies. The RF-kNN modeling approach was found to be very effective, especially for large-area estimation, and produced results statistically equivalent to the field observations or to estimates derived from secondary data sources. The models are useful to resource managers for operational and strategic purposes.
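A minimal sketch of RF-kNN imputation in the spirit of the approach described above is given below: a Random Forest's terminal-node co-occurrence serves as the nearness measure, and each unsampled location inherits the attribute of its k most similar reference plots. The data layout and parameter values are assumptions, not the thesis workflow.

```python
# RF-kNN imputation sketch: Random Forest terminal-node co-occurrence as the
# nearness measure, attribute imputed from the k nearest reference plots
# (variable names, k, and tree count are assumptions, not the thesis settings).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def rf_knn_impute(X_ref, y_ref, X_target, k=5, n_trees=500, seed=0):
    """X_ref: remote-sensing predictors at field plots; y_ref: measured attribute
    (e.g. biomass); X_target: predictors at unsampled pixels or stands."""
    y_ref = np.asarray(y_ref, dtype=float)
    rf = RandomForestRegressor(n_estimators=n_trees, random_state=seed).fit(X_ref, y_ref)
    leaves_ref = rf.apply(X_ref)                  # terminal-node ids, shape (n_ref, n_trees)
    leaves_tgt = rf.apply(X_target)
    imputed = np.empty(len(X_target))
    for i, row in enumerate(leaves_tgt):
        prox = (leaves_ref == row).mean(axis=1)   # share of trees landing in the same leaf
        nearest = np.argsort(prox)[-k:]           # k most similar reference plots
        imputed[i] = y_ref[nearest].mean()        # impute from their field measurements
    return imputed
```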