871 results for Optimal design of experiments


Relevance:

100.00%

Publisher:

Abstract:

From a customer-satisfaction standpoint, the sound quality of a product has become an important factor. The primary objective of this research is to determine the factors that affect the acceptability of impulse noise. Although the analysis is based on a sample impulse sound file from a commercial printer, the results can be applied to other, similar impulsive noises. It is assumed that impulsive noise can be tuned to meet the acceptability criteria; it is therefore necessary to find the most significant factors that can be controlled physically. The analysis is based on a single impulse. A sample impulsive sound file is modified for different amplitudes, background noise levels, attack times, release times, and spectral content. A two-level factorial design of experiments (DOE) is applied to study the significant effects and interactions. For each impulse file modified according to the DOE, the magnitude of perceived annoyance is calculated from an objective metric developed recently at Michigan Technological University. This metric is based on psychoacoustic criteria such as loudness, sharpness, roughness, and loudness-based impulsiveness; the software 'Artemis V11.2', developed by HEAD Acoustics, is used to calculate these psychoacoustic quantities. From the two-level factorial analysis, a new objective model of perceived annoyance is developed in terms of the physical parameters mentioned above: amplitude, background noise, impulse attack time, impulse release time, and spectral content. The effects of the significant individual factors, as well as their two-factor interactions, are also studied. The results show that all five factors significantly affect the annoyance level of an impulsive sound; the annoyance level can therefore be brought within the acceptability criteria by optimizing the factor levels. An additional analysis studies the effect of these five parameters on the individual psychoacoustic metrics.
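
As an illustration of the factorial workflow, the sketch below builds the 2^5 design and estimates main effects and two-factor interactions by least squares. The factor names come from the abstract; the annoyance responses are random placeholders standing in for the MTU perceived-annoyance metric evaluated in Artemis, so the fitted numbers here are meaningless.

```python
import itertools
import numpy as np

# Minimal sketch of a two-level factorial analysis with five factors.
factors = ["amplitude", "background_noise", "attack_time",
           "release_time", "spectral_content"]
runs = np.array(list(itertools.product([-1, 1], repeat=len(factors))))  # 2^5 = 32 runs

rng = np.random.default_rng(1)
annoyance = rng.normal(size=len(runs))        # placeholder response, one per run

# Model matrix: intercept, main effects, and all two-factor interactions
columns, names = [np.ones(len(runs))], ["intercept"]
for i, name in enumerate(factors):
    columns.append(runs[:, i]); names.append(name)
for i, j in itertools.combinations(range(len(factors)), 2):
    columns.append(runs[:, i] * runs[:, j]); names.append(f"{factors[i]}*{factors[j]}")
X = np.column_stack(columns)

coef, *_ = np.linalg.lstsq(X, annoyance, rcond=None)
for name, c in sorted(zip(names, coef), key=lambda t: -abs(t[1])):
    print(f"{name:40s} {c:+.3f}")             # effect estimates, largest first
```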

Relevance:

100.00%

Publisher:

Abstract:

The Environmental Process and Simulation Center (EPSC) at Michigan Technological University has hosted laboratories for the senior-level Environmental Engineering course CEE 4509, Environmental Process and Simulation Laboratory, since 2004. Although the five units in the EPSC give students hands-on experience with a wide range of water/wastewater treatment technologies, a key module was still missing for students to experience a full treatment cycle. This project fabricated a direct-filtration pilot system in the EPSC and produced a laboratory manual for educational purposes. Engineering applications such as clean-bed head loss calculation, backwash flow rate determination, multimedia density calculation, and run length prediction are included in the manual. The system was tested for one semester, and modifications were made to both the direct-filtration unit and the laboratory manual. Future work to further refine the module is also proposed.
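
As a flavor of the engineering applications in the manual, the sketch below implements one common textbook form of the Carman-Kozeny clean-bed head loss equation; the constants and the example media properties are assumptions, not values taken from the laboratory manual.

```python
def clean_bed_head_loss(approach_velocity, depth, d_grain, porosity,
                        sphericity=0.8, mu=1.002e-3, rho=998.2, g=9.81):
    """Clean-bed head loss [m of water] from a common textbook form of the
    Carman-Kozeny equation for laminar flow through a uniform media layer.
    Arguments are SI units; default water properties are for ~20 degC."""
    # Kozeny constant ~5 combined with the specific-surface term (6/d)^2
    # gives the familiar factor of 180.
    return (180.0 * mu * approach_velocity * depth * (1.0 - porosity) ** 2 /
            (rho * g * sphericity ** 2 * d_grain ** 2 * porosity ** 3))

# Example: 5 m/h filtration rate through 0.5 m of 0.5 mm sand at 40% porosity
print(clean_bed_head_loss(approach_velocity=5.0 / 3600, depth=0.5,
                          d_grain=0.5e-3, porosity=0.40))   # ~0.45 m
```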

Relevance:

100.00%

Publisher:

Abstract:

The goal of this research is to provide a framework for the vibro-acoustical analysis and design of a multiple-layer constrained-damping structure. Existing research on damping and viscoelastic damping mechanisms largely follows four mainstream approaches: modeling of damping treatments and materials; control through the electro-mechanical effect using a piezoelectric layer; optimization by adjusting the parameters of the structure to meet design requirements; and identification of the damping material's properties from the response of the structure. This research proposes a systematic design methodology for the multiple-layer constrained damping beam that takes vibro-acoustics into consideration.

A hybrid numerical modeling technique is presented to study the vibro-acoustics of multiple-layered viscoelastic laminated beams using the Biot damping model. The boundary element method (BEM) is used to model the acoustical cavity, whereas the finite element method (FEM) is the basis for the vibration analysis of the multiple-layered beam structure. Through the proposed procedure, the analysis can easily be extended to other complex geometries with arbitrary boundary conditions. The nonlinear behavior of viscoelastic damping materials is represented by the Biot damping model, which accounts for the effects of frequency, temperature, and the use of different damping materials in individual layers. A curve-fitting procedure used to obtain the Biot constants for each damping material at each temperature is explained. The structural vibration results for selected beams agree with published closed-form results, and the radiated noise predicted for a sample beam structure by commercial BEM software is compared with the acoustical results for the same beam obtained with the Biot damping model. The Biot damping model is also extended to the multiple-degree-of-freedom (MDOF) dynamic equations of a discrete system in order to introduce different types of viscoelastic damping materials. The mechanical properties of viscoelastic damping materials, such as shear modulus and loss factor, change with ambient temperature and frequency; applying a multiple-layer treatment increases the damping of the structure significantly and thus helps attenuate vibration and noise over a broad range of frequencies and temperatures.

The main contributions of this dissertation comprise three major tasks: 1) studying the viscoelastic damping mechanism and the dynamic equations of a multilayer damped system incorporating the Biot damping model; 2) building the finite element model of the multiple-layer constrained viscoelastic damping beam and conducting the vibration analysis; and 3) extending the vibration problem to the BEM-based acoustical problem and comparing the results with commercial simulation software.
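
The curve-fitting step can be illustrated with a short sketch. The code below assumes a commonly cited form of the Biot series, G(iw) = G0*(1 + sum_k a_k*iw/(iw + b_k)), and fits its constants to hypothetical complex-modulus measurements with scipy; it is a minimal illustration under that assumed form, not the dissertation's actual fitting procedure or data.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical measured data at one temperature (placeholder values):
# frequency [Hz], storage modulus [Pa], loss factor [-]
freq = np.array([10.0, 50.0, 100.0, 500.0, 1000.0])
G_storage = np.array([0.8e6, 1.1e6, 1.3e6, 1.9e6, 2.2e6])
eta = np.array([0.6, 0.75, 0.8, 0.85, 0.8])
G_meas = G_storage * (1.0 + 1j * eta)                 # complex shear modulus

def biot_modulus(params, omega, n_terms):
    """Assumed Biot series: G(iw) = G0 * (1 + sum a_k*iw/(iw + b_k))."""
    G0 = params[0]
    a = params[1:1 + n_terms]
    b = params[1 + n_terms:]
    s = 1j * omega[:, None]
    return G0 * (1.0 + (a * s / (s + b)).sum(axis=1))

def residual(params, omega, G_meas, n_terms):
    G_fit = biot_modulus(params, omega, n_terms)
    # Stack real and imaginary parts so the fit matches both G' and G''
    return np.concatenate([(G_fit.real - G_meas.real) / np.abs(G_meas),
                           (G_fit.imag - G_meas.imag) / np.abs(G_meas)])

n_terms = 3
omega = 2 * np.pi * freq
x0 = np.concatenate([[G_storage[0]], np.ones(n_terms),
                     10.0 ** np.arange(2, 2 + n_terms)])
fit = least_squares(residual, x0, args=(omega, G_meas, n_terms),
                    bounds=(1e-6, np.inf))            # keep G0, a_k, b_k positive
print("Fitted Biot constants:", fit.x)
```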

Relevance:

100.00%

Publisher:

Abstract:

When a concrete slab experiences differential volume change due to temperature, moisture, and shrinkage gradients, it deforms. The stresses induced by these differential volume changes can reduce the pavement’s fatigue life. Differential volume change is quantified by the equivalent temperature difference required to deform a comparable flat slab to the same shape as the actual slab. This thesis presents models to predict the equivalent temperature difference due to moisture warping and differential drying shrinkage. Moisture warping occurs because a portion of drying shrinkage is reversible, while differential drying shrinkage is due to the irreversible portion of drying shrinkage. The amount of reversible shrinkage was investigated for concretes made with different types of aggregate, including lightweight and recycled. Another source of differential volume change is built-in curl, which is caused by temperature gradients at the time of paving. This thesis also presents a comparison of methods used to quantify built-in curl.
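
As a minimal illustration of the quantification idea, the sketch below converts an assumed top-to-bottom drying-shrinkage strain difference into the equivalent linear temperature difference that would warp a flat slab into the same shape; the numbers are hypothetical and the thesis models are more detailed than this linear-gradient assumption.

```python
# Minimal sketch: equivalent temperature difference from a shrinkage gradient,
# assuming a linear strain profile through the slab (hypothetical values).
alpha_cte = 10e-6           # coefficient of thermal expansion [1/degC], assumed
eps_shrink_top = -300e-6    # drying shrinkage strain at the slab top
eps_shrink_bottom = -50e-6  # drying shrinkage strain at the slab bottom

delta_T_equivalent = (eps_shrink_top - eps_shrink_bottom) / alpha_cte
print(f"Equivalent temperature difference: {delta_T_equivalent:.1f} degC")  # -25.0 degC
```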

Relevance:

100.00%

Publisher:

Abstract:

Self-stabilization is a property of a distributed system whereby, regardless of the legitimacy of its current state, the system eventually reaches a legitimate state and remains legitimate thereafter. The elegance of self-stabilization stems from the fact that it endows distributed systems with strong fault tolerance against arbitrary state perturbations. The difficulty of designing and reasoning about self-stabilization has been noted by many researchers; most existing techniques for the verification and design of self-stabilization are either brute force or rely on manual approaches that are not amenable to automation. In this dissertation, we first investigate the possibility of automatically designing self-stabilization through global state space exploration. In particular, we develop a set of heuristics for automating the addition of recovery actions to distributed protocols on various network topologies. Our heuristics exploit both the computational power of a single workstation and the parallelism available on computer clusters. We obtain existing and new stabilizing solutions for classical protocols such as maximal matching, ring coloring, mutual exclusion, leader election, and agreement. Second, we consider a foundation for local reasoning about self-stabilization, i.e., studying the global behavior of the distributed system by exploring the state space of just one of its components. It turns out that local reasoning about deadlocks and livelocks is possible for an interesting class of protocols whose proof of stabilization is otherwise complex. In particular, we provide necessary and sufficient conditions, verifiable in the local state space of every process, for global deadlock-freedom and livelock-freedom of protocols on ring topologies. Local reasoning potentially circumvents two fundamental problems that complicate the automated design and verification of distributed protocols: (1) state explosion and (2) partial state information. Moreover, local proofs of convergence are independent of the number of processes in the network, which allows our assertions about deadlocks and livelocks to apply to rings of arbitrary size without incurring state explosion.
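
For a concrete picture of self-stabilizing mutual exclusion on a ring (one of the classical protocols mentioned above), the sketch below simulates Dijkstra's K-state token ring under a central daemon: started from an arbitrary state, it converges to states with exactly one enabled process. It is a textbook illustration, not one of the dissertation's synthesized solutions.

```python
import random

# Dijkstra's K-state self-stabilizing token ring (mutual exclusion).
N = 5                # number of processes on the ring
K = N + 1            # K > N guarantees convergence
x = [random.randrange(K) for _ in range(N)]   # arbitrary (possibly illegitimate) state

def enabled(i):
    if i == 0:
        return x[0] == x[N - 1]      # the bottom process holds the token
    return x[i] != x[i - 1]          # every other process

def step(i):
    if i == 0:
        x[0] = (x[0] + 1) % K
    else:
        x[i] = x[i - 1]

for _ in range(100):                 # central daemon: fire one enabled process
    tokens = [i for i in range(N) if enabled(i)]
    if len(tokens) == 1:             # legitimate state: exactly one privilege
        print("converged: single token at process", tokens[0], "state", x)
        break
    step(random.choice(tokens))
```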

Relevance:

100.00%

Publisher:

Abstract:

In recent years there has been a tremendous amount of research in nanotechnology. History tells us that the commercialization of a technology is always accompanied by both positive and negative effects for society and the environment. Products containing nanomaterials are already on the market, yet there is still little information about the potential negative effects these products may cause. The work presented in this dissertation describes a holistic approach to addressing different dimensions of nanotechnology sustainability. Life cycle analysis (LCA) was used to study the potential use of polyethylene filled with nanomaterials to manufacture automobile body panels. Results showed that the nanocomposite does not provide an environmental benefit over traditional steel panels. A new methodology based on design of experiments (DOE) techniques, coupled with LCA, was implemented to investigate the impact of inventory uncertainties. Results showed that data variability does not have a significant effect on the prediction of environmental impacts, whereas the material profiles chosen for the input materials had a highly significant effect on the overall impact. Energy consumption and material characterization were identified as two main areas where additional research is needed to predict the overall impact of nanomaterials more effectively. A study was undertaken to gain insight into the behavior of small particles in contact with a surface exposed to air flow, in order to determine when particles lift off the surface. A mapping strategy was implemented that identifies the conditions for particle lift-off based on particle size and separation distance from the wall. The main results showed that particles smaller than 0.1 mm will not become airborne under shear flow unless the separation distance is greater than 15 nm. These results may be used to minimize exposure to airborne materials. Societal implications that may arise in the workplace were also investigated. This research task explored topics including health, ethics, and worker perception, with the aim of identifying the base knowledge available in the literature. Recommendations are given for different scenarios describing how workers and employers could minimize the unwanted effects of nanotechnology production.
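
The lift-off question can be illustrated with a simplified force balance built from textbook relations (not the dissertation's mapping strategy): van der Waals adhesion of a sphere on a wall scales with the diameter d, while the shear-flow drag scales with d squared, so small particles at nanometre separations tend to stay attached. All parameter values below are assumed.

```python
from math import pi

# Simplified adhesion-versus-drag comparison for a sphere on a wall in air.
A_HAMAKER = 1.0e-19     # Hamaker constant [J], order-of-magnitude value (assumed)
MU_AIR = 1.8e-5         # dynamic viscosity of air [Pa*s]
SHEAR_RATE = 1.0e4      # wall shear rate [1/s], assumed

def adhesion_force(d, z):
    """Sphere-plate van der Waals force, F = A*d / (12*z^2)."""
    return A_HAMAKER * d / (12.0 * z ** 2)

def drag_force(d):
    """Drag on a sphere resting on a wall in linear shear flow,
    F ~ 1.7 * 6*pi*mu*a*(shear_rate*a) with a = d/2 (O'Neill-type wall correction)."""
    a = d / 2.0
    return 1.7 * 6.0 * pi * MU_AIR * SHEAR_RATE * a ** 2

for d in (1e-6, 1e-5, 1e-4):            # 1 um, 10 um, 100 um particles
    for z in (4e-10, 15e-9):            # near-contact vs. 15 nm separation
        ratio = adhesion_force(d, z) / drag_force(d)
        print(f"d = {d:.0e} m, z = {z:.0e} m -> adhesion/drag = {ratio:.2f}")
```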

Relevance:

100.00%

Publisher:

Abstract:

Neuromorphic computing has become an emerging field with a wide range of applications. Its challenge lies in developing a brain-inspired architecture that can emulate the human brain and work in real-time applications. In this report a flexible neural architecture is presented, consisting of a 128 x 128 SRAM crossbar memory and 128 spiking neurons. A digital integrate-and-fire model is used for the neurons. All components are designed in a 45 nm technology node. The core is fully digital and can be configured for certain neuron parameters, axon types, and synapse states. Learning for this architecture is done offline: a well-known algorithm, the Restricted Boltzmann Machine (RBM), is used to train the circuit, and linear classifiers are trained at the output of the RBM. Finally, the circuit was tested on a handwritten digit recognition application. Future prospects for this architecture are also discussed.
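
A behavioural sketch of the core (not the RTL) is shown below: a binary SRAM crossbar accumulates incoming spike vectors into integer membrane potentials, and neurons that cross a threshold fire and reset. The sizes match the 128 x 128 configuration described above; the weight, leak, and threshold values are illustrative, and in the real flow the synapse bits would come from the offline RBM training rather than from a random generator.

```python
import numpy as np

# Behavioural model of a digital integrate-and-fire crossbar core.
N_AXONS, N_NEURONS = 128, 128
rng = np.random.default_rng(0)
crossbar = rng.integers(0, 2, size=(N_AXONS, N_NEURONS))   # SRAM synapse bits
weight, leak, threshold = 1, 1, 20                          # per-core parameters (assumed)

v = np.zeros(N_NEURONS, dtype=int)                          # integer membrane potentials
for t in range(100):                                        # one tick per time step
    spikes_in = rng.integers(0, 2, size=N_AXONS)            # input spike vector
    v += weight * (spikes_in @ crossbar)                    # crossbar accumulate
    v -= leak                                                # linear leak
    fired = v >= threshold
    v[fired] = 0                                             # reset on spike
    v[v < 0] = 0                                             # clamp at zero
```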

Relevance:

100.00%

Publisher:

Abstract:

For a microgrid with a high penetration of renewable energy, energy storage becomes integral to system performance because of the stochastic nature of most renewable energy sources. This thesis examines droop control of an energy storage source in dc microgrids in order to optimize a global cost function. The approach uses a multidimensional surface to determine the optimal droop parameters based on load and state of charge. The optimal surface is determined using knowledge of the system architecture and can be implemented with fully decentralized source controllers. The optimal-surface control of the system is presented, along with the derivation of the cost function and the implementation of the optimal control. Results were verified using a hardware-in-the-loop system.
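
The control idea can be sketched as follows: the storage converter looks up its droop gain from a precomputed two-dimensional surface over load power and state of charge, then applies the usual dc droop law v_ref = V_nom - R_droop * i_out. The surface shape and all numbers below are placeholders, not the thesis's optimal surface.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Placeholder droop surface over (load power, state of charge).
V_NOM = 380.0                                   # bus voltage set point [V], assumed
load_axis = np.linspace(0.0, 10e3, 11)          # load power [W]
soc_axis = np.linspace(0.1, 0.9, 9)             # state of charge [-]
# Illustrative surface: droop resistance rises as SoC drops so the battery backs off
R_surface = 0.5 + 2.0 * (1.0 - soc_axis)[None, :] + 0.05 * (load_axis / 1e3)[:, None]
droop_lookup = RegularGridInterpolator((load_axis, soc_axis), R_surface)

def droop_reference(load_w, soc, i_out):
    """Decentralized controller: voltage reference from the droop surface."""
    r_droop = droop_lookup([[load_w, soc]])[0]
    return V_NOM - r_droop * i_out

print(droop_reference(load_w=4e3, soc=0.5, i_out=10.0))   # 363.0 V with these numbers
```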

Relevance:

100.00%

Publisher:

Abstract:

Although the nature of blood groups A and B has been studied comprehensively for a long time, it is still unclear exactly which epitope is recognized by antibodies with AB specificity, i.e., monoclonal and polyclonal antibodies that interact equally well with the antigens GalNAcalpha 1-3(Fucalpha 1-2)Gal (A trisaccharide) and Galalpha 1-3(Fucalpha 1-2)Gal (B trisaccharide) but do not react with their common fragment Fucalpha 1-2Gal. We hypothesized that, besides Fucalpha 1-2Gal, the A and B antigens share one more epitope. Trisaccharides A and B are practically identical from the conformational point of view; the only difference is at position 2 of the Galalpha residue, where trisaccharide A has an NHAc group and trisaccharide B has a hydroxyl group (see formulas). We hypothesized that the AB epitope should be situated in the part of the molecule opposite the NHAc group of the GalNAc residue. To test this hypothesis, we synthesized a polymeric conjugate in which the de-N-acetylated A trisaccharide is attached to a polymer via the nitrogen at position C-2 of the galactosamine residue. In this conjugate the putative AB epitope should be maximally accessible to antibodies in solution, whereas the site by which antibodies discriminate between antigens A and B should be maximally hidden owing to the close proximity of the polymer. Testing against several anti-AB monoclonal antibodies revealed that some of them did interact with the synthetic AB glycotope, confirming our hypothesis. Moreover, similar antibodies were detected in the blood of healthy blood group 0 donors. In addition, analysis of spatial models was performed to identify the hydroxyl groups of the Fuc, Galalpha, and Galbeta residues that are particularly involved in the composition of the AB glycotope.

Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Ischemic stroke is the leading cause of mortality worldwide and a major contributor to neurological disability and dementia. Terutroban is a specific TP receptor antagonist with antithrombotic, antivasoconstrictive, and antiatherosclerotic properties, which may be of interest for the secondary prevention of ischemic stroke. This article describes the rationale and design of the Prevention of cerebrovascular and cardiovascular Events of ischemic origin with teRutroban in patients with a history oF ischemic strOke or tRansient ischeMic Attack (PERFORM) Study, which aims to demonstrate the superiority of the efficacy of terutroban versus aspirin in secondary prevention of cerebrovascular and cardiovascular events. METHODS AND RESULTS: The PERFORM Study is a multicenter, randomized, double-blind, parallel-group study being carried out in 802 centers in 46 countries. The study population includes patients aged ≥55 years having suffered an ischemic stroke (≤3 months) or a transient ischemic attack (≤8 days). Participants are randomly allocated to terutroban (30 mg/day) or aspirin (100 mg/day). The primary efficacy endpoint is a composite of ischemic stroke (fatal or nonfatal), myocardial infarction (fatal or nonfatal), or other vascular death (excluding hemorrhagic death of any origin). Safety is being evaluated by assessing hemorrhagic events. Follow-up is expected to last for 2-4 years. Assuming a relative risk reduction of 13%, the expected number of primary events is 2,340. To obtain statistical power of 90%, this requires inclusion of at least 18,000 patients in this event-driven trial. The first patient was randomized in February 2006. CONCLUSIONS: The PERFORM Study will explore the benefits and safety of terutroban in secondary cardiovascular prevention after a cerebral ischemic event.
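
A rough check using the standard Schoenfeld events formula for a two-arm event-driven comparison (a simplified assumption; the trial's own calculation will include additional design adjustments) yields an event count of the same order as the 2,340 quoted above.

```python
from math import log
from scipy.stats import norm

# Schoenfeld events formula for a two-arm, 1:1, event-driven comparison.
alpha, power = 0.05, 0.90
hazard_ratio = 0.87                       # 13% relative risk reduction
z_alpha = norm.ppf(1 - alpha / 2)
z_beta = norm.ppf(power)
events = 4 * (z_alpha + z_beta) ** 2 / log(hazard_ratio) ** 2
print(round(events))                      # ~2,170, the same order as the 2,340 quoted
```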