1000 results for Michigan Tech


Relevance:

60.00%

Publisher:

Abstract:

This thesis will present strategies for the use of plug-in electric vehicles on smart grids and microgrids. MATLAB is used as the design tool for all models and simulations. First, a scenario will be explored using the dispatchable loads of electric vehicles to stabilize a microgrid with a high penetration of renewable power generation. Grid components for a microgrid with 50% photovoltaic solar production will be sized through an optimization routine to maintain storage system, load, and vehicle states over a 24-hour period. This portion finds that dispatchable loads can be used to guard against unpredictable losses in renewable generation output. Second, the use of distributed control strategies for the charging of electric vehicles, utilizing an agent-based approach on a smart grid, will be studied. The vehicles are regarded as loads additional to a primary forecasted load and exchange information with the grid to make their charging decisions. Three lightweight control strategies and their effects on the power grid will be presented. The findings are that distributed control strategies can shape charging behavior and reduce peak loads on the grid.
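As an illustrative sketch of the distributed-charging idea (not the thesis's MATLAB models), a greedy valley-filling scheduler can show how dispatchable EV loads reduce a forecast peak; all load figures below are hypothetical:

```python
def valley_fill(base_load, n_vehicles, charge_kw):
    """Greedy valley-filling: each vehicle charges in the hour with the
    lowest scheduled load so far, flattening the aggregate profile."""
    load = list(base_load)
    for _ in range(n_vehicles):
        h = min(range(len(load)), key=lambda i: load[i])
        load[h] += charge_kw
    return load

base = [40, 35, 30, 45, 60, 80, 70, 50]  # hypothetical hourly forecast (kW)
controlled = valley_fill(base, n_vehicles=10, charge_kw=3)
# uncontrolled: all 10 vehicles plug in at the evening peak (hour 5)
uncontrolled = [b + (30 if h == 5 else 0) for h, b in enumerate(base)]
```

Both cases deliver the same 30 kW of charging energy, but the scheduled case leaves the original 80 kW peak untouched while the uncontrolled case pushes it to 110 kW.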

Relevance:

60.00%

Publisher:

Abstract:

Semi-active damping devices have been shown to be effective in mitigating unwanted vibrations in civil structures. These devices impart force indirectly through real-time alterations to structural properties. Simulating the complex behavior of these devices for laboratory-scale experiments is a major challenge. Commercial devices for seismic applications typically operate in the 2-10 kN range; this force is too high for small-scale testing applications, where requirements typically range from 0-10 N. Several challenges must be overcome to produce damping forces at this level. In this study, a small-scale magneto-rheological (MR) damper utilizing a fluid-absorbent metal foam matrix is developed and tested to accomplish this goal. The matrix allows MR fluid to be extracted upon magnetic excitation to produce shear stresses and viscosity effects between an electromagnetic piston, the foam, and the damper housing. Dampers for uniaxial seismic excitation are traditionally positioned in the horizontal orientation, allowing MR fluid to gather in the lower part of the damper housing when partially filled. Thus, the absorbent matrix is placed in the bottom of the housing, relieving the need to fill the entire device with MR fluid, a practice that requires seals that add significant unwanted friction to the desired low-force device. The damper, once constructed, can be used in feedback control applications to reduce seismic vibrations and to test structural control algorithms and wireless command devices. To validate this device, a parametric study was performed utilizing force and acceleration measurements to characterize damper performance and controllability. A discussion of the results demonstrates the attainment of the damper design objectives.
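For intuition only, MR dampers are commonly approximated by a Bingham-plastic model: a viscous term plus a current-controlled yield force opposing motion. The coefficients below are hypothetical values sized to the 0-10 N small-scale range mentioned above, not the thesis's characterization:

```python
def bingham_force(velocity, c0, f_yield):
    """Bingham-plastic approximation of an MR damper:
    F = c0*v + f_yield*sign(v), where f_yield is set by coil current."""
    if velocity == 0.0:
        return 0.0
    sign = 1.0 if velocity > 0 else -1.0
    return c0 * velocity + f_yield * sign

# hypothetical small-scale values: 20 N*s/m viscous coefficient,
# 4 N yield force at the applied coil current
f = bingham_force(0.1, c0=20.0, f_yield=4.0)  # 2 N viscous + 4 N yield
```

Raising the coil current raises `f_yield`, which is how the controller modulates damping force in real time.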

Relevance:

60.00%

Publisher:

Abstract:

Following the rapid growth of China's economy, energy consumption, and especially electricity consumption, has increased enormously in the past 30 years. Since China has been using coal as the major energy source for electricity production during these years, environmental problems have become more and more serious. The research question for this paper is: "Can China use alternative energies instead of coal to produce more electricity in 2030?" Hydro power, nuclear power, natural gas, wind power, and solar power are considered the most feasible and popular alternative energies for China's current situation. To answer the research question, two things must be known: how much will total electricity consumption in China be by 2030, and how much electricity can the alternative energies provide in China by 2030? For a more reliable forecast, an econometric model using the ordinary least squares (OLS) method is established in this paper to predict total electricity consumption by 2030. The electricity expected from alternative energy sources by 2030 in China can be calculated from the existing literature. The research results are analyzed under a reference scenario and a max tech scenario. In the reference scenario, the combination of alternative energies can provide 47.71% of total electricity consumption by 2030; in the max tech scenario, 57.96%. These results are important not only because they indicate that the government's long-term goal is reachable, but also because they imply that the natural environment of China could have an encouraging future.
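A minimal sketch of the forecasting step: a one-regressor ordinary least squares fit of consumption against year, extrapolated to 2030. The data points are hypothetical placeholders; the paper's econometric model is certainly richer than this:

```python
def ols_fit(x, y):
    """Closed-form simple OLS: returns intercept a and slope b of y = a + b*x."""
    n = len(x)
    xbar, ybar = sum(x) / n, sum(y) / n
    b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
         / sum((xi - xbar) ** 2 for xi in x))
    a = ybar - b * xbar
    return a, b

years = [2000, 2005, 2010, 2015]   # hypothetical sample
twh = [1347, 2497, 4207, 5802]     # hypothetical consumption (TWh)
a, b = ols_fit(years, twh)
forecast_2030 = a + b * 2030       # linear extrapolation to 2030
```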

Relevance:

60.00%

Publisher:

Abstract:

Synthetic oligonucleotides and peptides have found wide application in industry and academic research labs. There are ~60 peptide drugs on the market and over 500 under development. Global annual sales of peptide drugs in 2010 were estimated at $13 billion. There are three oligonucleotide-based drugs on the market; among them, the newly FDA-approved Kynamro was predicted to reach $100 million in annual sales. Annual sales of oligonucleotides to academic labs were estimated at $700 million. Both bio-oligomers are mostly synthesized on automated synthesizers using solid-phase synthesis technology, in which nucleoside or amino acid monomers are added sequentially until the desired full-length sequence is reached. The additions cannot be complete, which generates truncated, undesired failure sequences. For almost all applications, these impurities must be removed. The most widely used method is HPLC. However, the method is slow, expensive, labor-intensive, not amenable to automation, difficult to scale up, and unsuitable for high-throughput purification. It needs large capital investment and consumes large volumes of harmful solvents. Purification costs are estimated at more than 50% of total production costs. Other methods for bio-oligomer purification also have drawbacks and are less favored than HPLC for most applications. To overcome the problems of known biopolymer purification technologies, we have developed two non-chromatographic purification methods: (1) catching failure sequences by polymerization, and (2) catching full-length sequences by polymerization. In the first method, a polymerizable group is attached to the failure sequences of the bio-oligomers during automated synthesis; purification is achieved by simply polymerizing the failure sequences into an insoluble gel and extracting the full-length sequences.
In the second method, a polymerizable group is attached to the full-length sequences, which are then incorporated into a polymer; impurities are removed by washing, and the pure product is cleaved from the polymer. These methods do not need chromatography, so the drawbacks of HPLC no longer apply. Using them, purification is achieved by simple manipulations such as shaking and extraction. Therefore, they are suitable for large-scale purification of oligonucleotide and peptide drugs, and also ideal for high-throughput purification, which is currently in high demand for research projects involving total gene synthesis. The dissertation will present the details of the development of these techniques. Chapter 1 will introduce oligodeoxynucleotides (ODNs) and their synthesis and purification. Chapter 2 will describe detailed studies of using the catching-failure-sequences-by-polymerization method to purify ODNs. Chapter 3 will describe further optimization of this ODN purification technology to the level of practical use. Chapter 4 will present the use of the catching-full-length-sequences-by-polymerization method for ODN purification using an acid-cleavable linker. Chapter 5 will introduce peptides and their synthesis and purification. Chapter 6 will describe studies using the catching-full-length-sequences-by-polymerization method for peptide purification.

Relevance:

60.00%

Publisher:

Abstract:

As microgrid power systems gain prevalence and renewable energy comprises greater and greater portions of distributed generation, energy storage becomes important to offset the higher variance of renewable energy sources and maximize their usefulness. One of the emerging techniques is to utilize a combination of lead-acid batteries and ultracapacitors to provide both short and long-term stabilization to microgrid systems. The different energy and power characteristics of batteries and ultracapacitors imply that they ought to be utilized in different ways. Traditional linear controls can use these energy storage systems to stabilize a power grid, but cannot effect more complex interactions. This research explores a fuzzy logic approach to microgrid stabilization. The ability of a fuzzy logic controller to regulate a dc bus in the presence of source and load fluctuations, in a manner comparable to traditional linear control systems, is explored and demonstrated. Furthermore, the expanded capabilities (such as storage balancing, self-protection, and battery optimization) of a fuzzy logic system over a traditional linear control system are shown. System simulation results are presented and validated through hardware-based experiments. These experiments confirm the capabilities of the fuzzy logic control system to regulate bus voltage, balance storage elements, optimize battery usage, and effect self-protection.
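A toy Sugeno-style fragment conveys the flavor of such a controller: triangular membership functions on the dc-bus voltage error, three hypothetical rules, and weighted-average defuzzification into a normalized storage power command. This is an assumption-laden sketch, not the controller developed in the research:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_power_command(v_error):
    """Map dc-bus voltage error (V) to a normalized storage power command
    in [-1, 1] via a weighted average of three hypothetical rule outputs."""
    rules = [
        (tri(v_error, -10, -5, 0), -1.0),  # error negative -> absorb power
        (tri(v_error, -5, 0, 5), 0.0),     # error near zero -> idle
        (tri(v_error, 0, 5, 10), 1.0),     # error positive -> inject power
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.0
```

Unlike a fixed linear gain, extra rules (e.g. on battery state of charge) can be added to the same structure to express storage balancing or self-protection behaviors.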

Relevance:

60.00%

Publisher:

Abstract:

As continued global funding and coordination are allocated toward improving access to safe sources of drinking water, alternative solutions may be necessary to expand implementation to remote communities. This report evaluates two technologies used in a small water distribution system in a mountainous region of Panama: solar-powered pumping and flow-reducing discs. The two parts of the system function independently, but both were chosen for their ability to mitigate unique issues in the community. The design program NeatWork and flow-reducing discs were evaluated because they are tools taught to Peace Corps Volunteers in Panama. Even when ample water is available, mountainous terrain affects the pressure available throughout a water distribution system. Since the static head in the system varies only with the height of water in the tank, frictional losses from pipes and fittings must be exploited to balance out the inequalities caused by the uneven terrain. Reducing the maximum allowable flow to connections through the installation of flow-reducing discs can help retain enough residual pressure in the main distribution lines to provide reliable service to all connections. NeatWork was calibrated to measured flow rates by changing the orifice coefficient (θ), resulting in a value of 0.68, which is 10-15% higher than typical values for manufactured flow-reducing discs. NeatWork was used to model various system configurations to determine whether a single-sized flow-reducing disc could provide equitable flow rates throughout an entire system. There is a strong correlation between the optimum single-sized flow-reducing disc and the average elevation change throughout a water distribution system: the larger the elevation change across the system, the smaller the recommended uniform orifice size. Renewable energy can jump the infrastructure gap and provide basic services at a fraction of the cost and time required to install transmission lines.
Methods for the assessment of solar-powered pumping systems as a means of rural water supply are presented and assessed. It was determined that manufacturer-provided product specifications can be used to appropriately design a solar pumping system, but care must be taken to ensure that sufficient water can be provided despite variations in solar intensity.
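Flow-reducing discs act as simple orifices, so the calibrated coefficient drops into the standard orifice equation Q = θ·A·√(2gh). Only θ = 0.68 comes from the report; the disc diameter and head below are hypothetical:

```python
import math

def orifice_flow(theta, diameter_m, head_m, g=9.81):
    """Flow through a flow-reducing disc orifice: Q = theta * A * sqrt(2*g*h)."""
    area = math.pi * (diameter_m / 2) ** 2
    return theta * area * math.sqrt(2 * g * head_m)

# hypothetical 4 mm disc under 30 m of head, theta = 0.68 as calibrated
q = orifice_flow(0.68, 0.004, 30.0)  # m^3/s
lpm = q * 60000                      # liters per minute
```

For these assumed values the delivered flow is on the order of 12 L/min; shrinking the orifice diameter reduces the flow quadratically, which is how the discs equalize service across elevation differences.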

Relevance:

60.00%

Publisher:

Abstract:

This dissertation established a standard foam index test: the absolute foam index test. The test characterizes coal fly ash by the absolute volume of air-entraining admixture (AEA), defined as the amount of undiluted AEA solution, necessary to produce a 15-second metastable foam in a coal fly ash-cement slurry at a 15-minute endpoint. The test was used to characterize fly ash samples having loss on ignition (LOI) values ranging from 0.17 to 23.3 wt.%. Results were compared from several foam index test time trials that used different initial test concentrations to reach termination at selected times. Based on the coefficient of variation (CV), a 15-minute endpoint, with limits of 12 to 18 minutes, was chosen. Various initial test concentrations were used to achieve consistent contact times and concentration gradients for the 15-minute endpoint. A set of four standard concentrations for the absolute foam index test was then defined by regression analyses, along with a procedure simplifying the test process. The standard concentrations were determined by analyzing experimental results of 80 tests on coal fly ashes with LOI values ranging from 0.39 to 23.3 wt.%. A regression analysis informed selection of four concentrations (2, 6, 10, and 15 vol.% AEA) that are expected to accommodate fly ashes with 0.39 to 23.3 wt.% LOI, depending on the AEA type. Higher concentrations should be used for high-LOI fly ash when necessary. A procedure developed using these standard concentrations is expected to require only 1-3 trials to meet the specified endpoint criteria for most fly ashes.
The AEA solution concentration that achieved the metastable foam in the foam index test was compared to the AEA equilibrium concentration obtained from the direct adsorption isotherm test with the same fly ash. The results showed that the AEA concentration that satisfied the absolute foam index test was much less than the equilibrium concentration, indicating that the absolute foam index test is not at or near equilibrium. Rather, it is a dynamic test in which the time of the test plays an important role in the results. Even though the absolute foam index does not represent an equilibrium condition, a correlation was made between the absolute foam index and adsorption isotherms. Equilibrium isotherm equations obtained from direct isotherm tests were used to calculate the equilibrium concentrations and capacities of fly ash from 0.17 to 10.5% LOI. The calculated fly ash capacity was much less than the capacities obtained from isotherm tests conducted with higher initial concentrations, confirming the dynamic character of the test. Several batches of mortars were mixed for the same fly ash type, increasing only the AEA concentration (dosage) in each subsequent batch. Mortar air test results showed that air content increased with each increase in AEA concentration until a point where a further increase produced no increase in air content. This was the maximum air content that could be achieved by the particular mortar system; the system reached its air capacity at the saturation limit. This concentration of AEA was compared to the critical micelle concentration (CMC) for the AEA and to the absolute foam index.
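To illustrate the isotherm comparison, a Langmuir model relates equilibrium AEA concentration to adsorbed capacity. With the hypothetical parameters below, the capacity at a foam-index-level concentration falls well below the capacity at a long-contact equilibrium concentration, mirroring the finding that the test is dynamic rather than equilibrium-controlled:

```python
def langmuir_capacity(c_eq, q_max, k):
    """Langmuir adsorption isotherm: q = q_max * k * c / (1 + k * c)."""
    return q_max * k * c_eq / (1.0 + k * c_eq)

# hypothetical parameters: q_max = 5 mg/g, k = 0.8 L/mg
q_foam_index = langmuir_capacity(0.5, 5.0, 0.8)    # dose reached in the 15-min test
q_equilibrium = langmuir_capacity(10.0, 5.0, 0.8)  # long-contact isotherm test
```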

Relevance:

60.00%

Publisher:

Abstract:

Clouds are one of the most influential elements of weather in the earth system, yet they are also one of the least understood. Understanding their composition and behavior at small scales is critical to understanding and predicting larger-scale feedbacks. Currently, the best method to study clouds on the microscale is through airborne in situ measurements using optical instruments capable of resolving clouds at the individual-particle level. However, current instruments are unable to sufficiently resolve the scales important to cloud evolution and behavior. The Holodec is a new-generation optical cloud instrument that uses digital inline holography to overcome many of the limitations of conventional instruments. However, its performance and reliability were limited by several deficiencies in its original design. These deficiencies were addressed and corrected to advance the instrument from the prototype stage to an operational instrument. In addition, the processing software used to reconstruct and analyze digitally recorded holograms was improved to increase robustness and ease of use.

Relevance:

60.00%

Publisher:

Abstract:

In power-electronics-based microgrids, the computational requirements needed to implement an optimized online control strategy can be prohibitive. The work presented in this dissertation proposes a generalized method of deriving geometric manifolds in a dc microgrid, based on the a priori computation of the optimal reactions and trajectories for classes of events in the microgrid. The proposed states are the stored energies in all the energy storage elements of the dc microgrid and the power flowing into them. It is anticipated that calculating a large enough set of dissimilar transient scenarios will also cover many scenarios not specifically used to develop the surface. These geometric manifolds are then used as reference surfaces in any type of controller, such as a sliding-mode hysteretic controller. The presence of switched power converters in microgrids requires different control actions for different system events, and control of the switch states of the converters is essential for steady-state and transient operation. A digital-memory look-up based controller that uses a hysteretic sliding-mode control strategy is an effective technique to generate the proper switch states for the converters. An example dc microgrid with three dc-dc boost converters and resistive loads is considered for this work. The geometric manifolds are successfully generated for transient events, such as step changes in the loads and the sources. The surfaces corresponding to a specific case of step change in the loads are then used as reference surfaces in an EEPROM for experimentally validating the control strategy. The required switch states corresponding to this specific transient scenario are programmed in the EEPROM as a memory table. This controls the switching of the dc-dc boost converters and drives the system states to the reference manifold.
In this work, it is shown that this strategy effectively controls the system under transient conditions, such as step changes in the loads, for the example case.
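The offline-table idea can be caricatured in a few lines: quantize a normalized stored-energy state, precompute the switch state that drives energy toward a reference value, and add a hysteresis band so the switch does not chatter at the boundary. This is a one-state toy, not the dissertation's multi-converter manifold:

```python
def build_table(n_bins, reference):
    """Offline: for each quantized energy bin, store the boost-converter
    switch state (1 = charge) that moves energy toward the reference."""
    table = []
    for i in range(n_bins):
        energy = i / (n_bins - 1)  # normalized stored energy in [0, 1]
        table.append(1 if energy < reference else 0)
    return table

def lookup_switch(table, energy, reference, band, last):
    """Online: hysteretic memory lookup -- inside the band, hold the last state."""
    if abs(energy - reference) < band:
        return last
    n = len(table)
    i = min(n - 1, max(0, int(round(energy * (n - 1)))))
    return table[i]

table = build_table(11, reference=0.5)
```

In hardware, `build_table` corresponds to the offline surface computation and the list plays the role of the EEPROM memory table indexed by the quantized measured state.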

Relevance:

60.00%

Publisher:

Abstract:

Membrane filtration has become an accepted technology for the removal of pathogens from drinking water. Viruses, known to contaminate water supplies, are too small to be removed by a size-exclusion mechanism without a large energy penalty. Thus, functionalized electrospun membranes that can adsorb viruses have drawn our interest. We chose a quaternized chitosan derivative (HTCC) which carries a positively-charged quaternary amine, known to bind negatively-charged virus particles, as a functionalized membrane material. The technique of electrospinning was utilized to produce nanofiber mats with large pore diameters to increase water flux and decrease membrane fouling. In this study, stable, functionalized, electrospun HTCC-PVA nanofibers that can remove 3.6 logs (99.97%) of a model virus, porcine parvovirus (PPV), from water by adsorption and filtration have been successfully produced. This technology has the potential to purify drinking water in undeveloped countries and reduce the number of deaths due to lack of sanitation.
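The "3.6 logs" figure above is a log reduction value, LRV = log10(C_in/C_out), and converting it to percent removal is a one-liner:

```python
import math

def log_removal(c_in, c_out):
    """Log reduction value: LRV = log10(c_in / c_out)."""
    return math.log10(c_in / c_out)

def percent_removal(lrv):
    """Convert an LRV to percent of particles removed."""
    return 100.0 * (1.0 - 10.0 ** (-lrv))

# 3.6 logs, as reported for PPV above, corresponds to ~99.97% removal
pct = percent_removal(3.6)
```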

Relevance:

60.00%

Publisher:

Abstract:

Free-radical retrograde-precipitation polymerization (FRRPP) is a novel polymerization process discovered by Dr. Gerard Caneba in the late 1980s. The current study is aimed at gaining a better understanding of the reaction mechanism of FRRPP and the thermodynamically driven features that are predominant in controlling the chain reaction. A previously developed mathematical model of free-radical polymerization kinetics was used to simulate a classic bulk polymerization system from the literature. Unlike other existing models, this sparse-matrix-based representation allows one to explicitly accommodate chain-length-dependent kinetic parameters. Extrapolating from past results, mixing was experimentally shown to exert a significant influence on reaction control in FRRPP systems. Mixing alone drives the otherwise severely diffusion-controlled reaction propagation in phase-separated polymer domains. Therefore, in a quiescent system, in the absence of mixing, it is possible to retard the growth of phase-separated domains, thus producing isolated polymer nanoparticles (globules). This diffusion-controlled, self-limiting chain growth was also observed in time-resolved small-angle x-ray scattering studies of reaction kinetics in quiescent FRRPP systems. Combining the concept of self-limiting chain growth in quiescent FRRPP systems with the spatioselective reaction initiation of lithography, microgel structures were synthesized in a single step, without the use of molds or additives. Hard x-rays from the bending-magnet radiation of a synchrotron were used as the initiation source, instead of the more statistically oriented chemical initiators. This spatially defined reaction was shown to be self-limited to the irradiated regions, following a polymerization-induced self-assembly phenomenon.
The pattern-transfer aspects of this technique were therefore studied in the FRRPP of N-isopropylacrylamide (NIPAm) and methacrylic acid (MAA), which form a thermoreversible and an ionic hydrogel, respectively. Reaction temperature increases the contrast between the exposed and unexposed zones of the formed microgels, while the irradiation dose is directly proportional to the extent of phase separation. The response of poly(NIPAm) microgels prepared by the technique described in this study was also characterized by small-angle neutron scattering.

Relevance:

60.00%

Publisher:

Abstract:

In 1970 Clark Benson published a theorem in the Journal of Algebra stating a congruence for generalized quadrangles. Since then, this theorem has been extended to other specific geometries. In this thesis, the theorem for partial geometries is extended in Chapter 2 to develop new divisibility conditions for the existence of a partial geometry. In Chapter 3 the theorem is applied to higher-dimensional arcs, resulting in parameter restrictions on geometries derived from these structures. In Chapter 4 we extend previous work on partial geometries with α = 2 to uncover potential partial geometries with higher values of α. In Chapter 5 the theorem is extended to strongly regular graphs, and we obtain expressions for the multiplicities of the eigenvalues of matrices related to the adjacency matrices of these graphs. Finally, a four-lesson, high-school-level enrichment unit is included to give students an introduction to partial geometries and strongly regular graphs and an opportunity to develop proof skills in this new context.
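The eigenvalue multiplicities mentioned above follow from the standard strongly-regular-graph parameter formulas. The sketch below evaluates them for the Petersen graph, srg(10, 3, 0, 1); the example is mine, not taken from the thesis:

```python
import math

def srg_eigen(n, k, lam, mu):
    """Eigenvalues r > s (excluding the valency k) and their multiplicities
    f, g for a strongly regular graph srg(n, k, lambda, mu)."""
    d = math.sqrt((lam - mu) ** 2 + 4 * (k - mu))
    r = ((lam - mu) + d) / 2
    s = ((lam - mu) - d) / 2
    f = ((n - 1) - (2 * k + (n - 1) * (lam - mu)) / d) / 2  # multiplicity of r
    g = ((n - 1) + (2 * k + (n - 1) * (lam - mu)) / d) / 2  # multiplicity of s
    return (r, f), (s, g)

# Petersen graph: srg(10, 3, 0, 1)
(r, f), (s, g) = srg_eigen(10, 3, 0, 1)
```

This yields eigenvalues 1 and -2 with multiplicities 5 and 4; together with the valency eigenvalue k = 3 of multiplicity 1, the multiplicities sum to n = 10, the integrality check commonly used to rule out parameter sets.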

Relevance:

60.00%

Publisher:

Abstract:

The objective of this thesis is to outline a Performance-Based Engineering (PBE) framework to address the multiple hazards of earthquake (EQ) and subsequent fire following earthquake (FFE). Currently, fire codes in the United States are largely empirical and prescriptive in nature. The reliance on prescriptive requirements makes quantifying sustained fire damage difficult. Additionally, the empirical standards have resulted from furnace testing of individual members or assemblies, which has been shown to differ greatly from full structural system behavior. The very nature of fire behavior (ignition, growth, suppression, and spread) is fundamentally difficult to quantify due to the inherent randomness present in each stage of fire development. The study of interactions between earthquake damage and fire behavior is also in its infancy, with essentially no empirical testing results available. This thesis presents a literature review, a discussion and critique of the state of the art, and a summary of software currently being used to estimate losses due to EQ and FFE. A generalized PBE framework for EQ and subsequent FFE is presented, along with a combined hazard-probability-to-performance-objective matrix and a table of the variables necessary to fully implement the proposed framework. Future research requirements and a summary are also provided, with discussion of the difficulties inherent in adequately describing the multiple hazards of EQ and FFE.

Relevance:

60.00%

Publisher:

Abstract:

The acoustic emission (AE) technique, a non-intrusive and nondestructive evaluation technique, acquires and analyzes the signals emitted by deformation or fracture of materials and structures under service loading. The AE technique has been successfully applied to damage detection in various materials such as metals, alloys, concrete, polymers, and other composite materials. In this study, the AE technique was used to detect crack behavior within concrete specimens under mechanical and environmental frost loadings. The instrumentation of the AE system used in this study, purchased from Mistras Group, includes a low-frequency AE sensor, a computer-based data acquisition device, and a preamplifier linking the sensor to the data acquisition device. The AE technique was applied to detect damage in the following laboratory tests: the pencil lead test, the mechanical three-point single-edge notched beam bending (SEB) test, and the freeze-thaw damage test. First, the pencil lead test was conducted to verify the attenuation of AE signals through concrete materials, and the attenuation was quantified. The signals obtained also indicated that the AE system was properly set up to detect damage in concrete. Second, the SEB test on a lab-prepared concrete beam was conducted using a Mechanical Testing System (MTS) and the AE system. The cumulative AE events and the measured loading curves, both plotted against the crack-tip opening displacement (CTOD), showed that the detected AE events were qualitatively correlated with the global force-displacement behavior of the specimen. The Weibull distribution was proposed to quantitatively describe the rupture probability density function.
Linear regression analysis was conducted to calibrate the Weibull distribution parameters with the detected AE signals and to predict the rupture probability as a function of CTOD for the specimen. Finally, controlled concrete freeze-thaw cyclic tests were designed, with the AE technique planned for investigating the internal frost damage process of concrete specimens.
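The calibration step can be sketched as follows: the two-parameter Weibull CDF F(x) = 1 - exp(-(x/λ)^k) linearizes to ln(-ln(1-F)) = k·ln x - k·ln λ, so shape and scale fall out of a linear regression. The demo data are synthetic (hypothetical), not the AE measurements:

```python
import math

def weibull_fit(samples):
    """Fit shape k and scale lam of a two-parameter Weibull by linear
    regression on ln(-ln(1-F)) = k*ln(x) - k*ln(lam), using median-rank
    plotting positions F_i = (i - 0.3) / (n + 0.4)."""
    xs = sorted(samples)
    n = len(xs)
    pts = [(math.log(x), math.log(-math.log(1.0 - (i + 0.7) / (n + 0.4))))
           for i, x in enumerate(xs)]
    mx = sum(px for px, _ in pts) / n
    my = sum(py for _, py in pts) / n
    k = (sum((px - mx) * (py - my) for px, py in pts)
         / sum((px - mx) ** 2 for px, _ in pts))
    lam = math.exp(mx - my / k)  # intercept -k*ln(lam) solved for lam
    return k, lam

# synthetic demo: samples placed exactly at Weibull(k=1.5, lam=2.0) quantiles
n = 10
f = [(i + 0.7) / (n + 0.4) for i in range(n)]
xs = [2.0 * (-math.log(1.0 - fi)) ** (1.0 / 1.5) for fi in f]
k, lam = weibull_fit(xs)  # recovers k = 1.5, lam = 2.0
```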

Relevance:

60.00%

Publisher:

Abstract:

Principal component analysis (PCA) is a popular method for dimension reduction used in many fields, including data compression, image processing, and exploratory data analysis. However, the traditional PCA method has several drawbacks: it is not efficient for high-dimensional data, and it cannot compute sufficiently accurate principal components when a relatively large portion of the data is missing. In this report, we propose using the EM-PCA method for dimension reduction of power system measurements with missing data, and we provide a comparative study of the traditional PCA and EM-PCA methods. Our extensive experimental results show that EM-PCA is more effective and more accurate than traditional PCA for dimension reduction of power system measurement data when a large portion of the data set is missing.
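A pure-Python caricature of the EM idea for rank-1 PCA with missing entries: alternate between one power-iteration refit of the leading principal component and re-imputation of the missing cells from the rank-1 reconstruction. Real EM-PCA implementations differ in detail; this is only a sketch:

```python
def em_pca_rank1(rows, iters=100):
    """EM-style rank-1 PCA with missing entries (None): alternate between
    re-imputing missing cells from the rank-1 model (E-step) and one
    power-iteration sweep for the leading component (M-step)."""
    n, m = len(rows), len(rows[0])
    X = [row[:] for row in rows]
    for j in range(m):  # initialize missing cells with column means
        obs = [r[j] for r in rows if r[j] is not None]
        for i in range(n):
            if X[i][j] is None:
                X[i][j] = sum(obs) / len(obs)
    v = [1.0] * m
    for _ in range(iters):
        mu = [sum(X[i][j] for i in range(n)) / n for j in range(m)]
        C = [[X[i][j] - mu[j] for j in range(m)] for i in range(n)]
        scores = [sum(C[i][j] * v[j] for j in range(m)) for i in range(n)]
        w = [sum(C[i][j] * scores[i] for i in range(n)) for j in range(m)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]  # M-step: updated leading component
        for i in range(n):         # E-step: re-impute from rank-1 fit
            s = sum(C[i][j] * v[j] for j in range(m))
            for j in range(m):
                if rows[i][j] is None:
                    X[i][j] = mu[j] + s * v[j]
    return v, X

# toy matrix: column 2 is exactly twice column 1; the missing cell's true value is 6
rows = [[1.0, 2.0], [2.0, 4.0], [3.0, None], [4.0, 8.0]]
v, X = em_pca_rank1(rows)
```

On this toy matrix the missing cell is pulled from its column-mean initialization (about 4.67) toward the rank-1-consistent value 6, which plain mean imputation followed by one-shot PCA would not achieve.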