33 results for Quality engineering

at Indian Institute of Science - Bangalore - India


Relevance:

40.00%

Publisher:

Abstract:

It is well known that protein crystallizability can be influenced by site-directed mutagenesis of residues on the molecular surface of proteins, indicating that the intermolecular interactions in crystal-packing regions may play a crucial role in the structural regularity at atomic resolution of protein crystals. Here, a systematic examination was made of the improvement in the diffraction resolution of protein crystals on introducing a single mutation of a crystal-packing residue in order to provide more favourable packing interactions, using diphthine synthase from Pyrococcus horikoshii OT3 as a model system. All of a total of 21 designed mutants at 13 different crystal-packing residues yielded almost isomorphous crystals from the same crystallization conditions as those used for the wild-type crystals, which diffracted X-rays to 2.1 angstrom resolution. Of the 21 mutants, eight provided crystals with an improved resolution of 1.8 angstrom or better. Thus, it has been clarified that crystal quality can be improved by introducing a suitable single mutation of a crystal-packing residue. In the improved crystals, more intimate crystal-packing interactions than those in the wild-type crystal are observed. Notably, the mutants K49R and T146R yielded crystals with outstandingly improved resolutions of 1.5 and 1.6 angstrom, respectively, in which a large-scale rearrangement of packing interactions was unexpectedly observed despite the retention of the same isomorphous crystal form. In contrast, the mutants that provided results that were in good agreement with the designed putative structures tended to achieve only moderate improvements in resolution of up to 1.75 angstrom. These results suggest a difficulty in the rational prediction of highly effective mutations in crystal engineering.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a machine learning approach to measuring the visual quality of JPEG-coded images. The features for predicting the perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality assessment involves estimating the functional relationship between the HVS features and subjective test scores. The quality of the compressed images is estimated without reference to their original images (a 'no-reference' metric). Here, the problem of quality estimation is transformed into a classification problem and solved using the extreme learning machine (ELM) algorithm. In ELM, the input weights and the bias values are chosen randomly and the output weights are calculated analytically. The generalization performance of the ELM algorithm for classification problems with an imbalance in the number of samples per quality class depends critically on the input weights and the bias values. Hence, we propose two schemes, namely the k-fold selection scheme (KS-ELM) and the real-coded genetic algorithm (RCGA-ELM), to select the input weights and the bias values such that the generalization performance of the classifier is maximized. Results indicate that the proposed schemes significantly improve the performance of the ELM classifier under imbalanced conditions for image quality assessment. The experimental results show that the visual quality estimated by the proposed RCGA-ELM tracks the mean opinion score very well. The results are also compared with an existing JPEG no-reference image quality metric and the full-reference structural similarity image quality metric.
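
The core ELM step described above (random input weights and biases, output weights solved analytically) fits in a few lines. The sketch below assumes the HVS features have already been extracted into a matrix and that quality classes are one-hot encoded; it shows only the basic ELM, not the KS-ELM or RCGA-ELM weight-selection schemes, and all names are illustrative.

```python
import numpy as np

def train_elm(X, T, n_hidden=50, rng=None):
    """Basic ELM: random input weights/biases, output weights solved analytically.

    X: (n_samples, n_features) matrix of precomputed HVS features (assumed).
    T: (n_samples, n_classes) one-hot quality-class targets (assumed).
    """
    rng = np.random.default_rng(rng)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # random input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # random bias values
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # hidden-layer outputs (sigmoid)
    beta = np.linalg.pinv(H) @ T                             # output weights via pseudo-inverse
    return W, b, beta

def predict_elm(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)                       # predicted quality class
```

The KS-ELM and RCGA-ELM schemes in the paper would wrap this training step and search over (W, b) for the best generalization; only the inner step is shown here.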

Relevance:

30.00%

Publisher:

Abstract:

A fuzzy waste-load allocation model, FWLAM, is developed for water quality management of a river system using fuzzy multiple-objective optimization. An important feature of this model is its capability to incorporate the aspirations and conflicting objectives of the pollution control agency and dischargers. The vagueness associated with specifying the water quality criteria and fraction removal levels is modeled in a fuzzy framework. The goals related to the pollution control agency and dischargers are expressed as fuzzy sets. The membership functions of these fuzzy sets are considered to represent the variation of satisfaction levels of the pollution control agency and dischargers in attaining their respective goals. Two formulations—namely, the MAX-MIN and MAX-BIAS formulations—are proposed for FWLAM. The MAX-MIN formulation maximizes the minimum satisfaction level in the system. The MAX-BIAS formulation maximizes a bias measure, giving a solution that favors the dischargers. Maximization of the bias measure attempts to keep the satisfaction levels of the dischargers away from the minimum satisfaction level and that of the pollution control agency close to the minimum satisfaction level. Most of the conventional water quality management models use waste treatment cost curves that are uncertain and nonlinear. Unlike such models, FWLAM avoids the use of cost curves. Further, the model provides the flexibility for the pollution control agency and dischargers to specify their aspirations independently.
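
With linear membership functions, the MAX-MIN formulation reduces to a small linear program: maximize the minimum satisfaction level λ subject to every goal's membership being at least λ. The sketch below uses two dischargers and made-up membership breakpoints purely for illustration; the paper's actual memberships come from the water-quality response and the aspirations of the pollution control agency and the dischargers.

```python
from scipy.optimize import linprog

# Variables z = [x1, x2, lam]: fractional removal levels and the minimum
# satisfaction level. Illustrative linear memberships (hypothetical numbers):
#   PCA goal (grows with removal):        mu_pca_i = (x_i - 0.30) / 0.60
#   Discharger goal (falls with removal): mu_dis_i = (0.85 - x_i) / 0.50
# MAX-MIN: maximize lam subject to every mu >= lam.

c = [0.0, 0.0, -1.0]                    # minimize -lam  <=>  maximize lam
A_ub = [
    [-1 / 0.60, 0.0, 1.0],              # lam - mu_pca_1 <= 0
    [0.0, -1 / 0.60, 1.0],              # lam - mu_pca_2 <= 0
    [ 1 / 0.50, 0.0, 1.0],              # lam - mu_dis_1 <= 0
    [0.0,  1 / 0.50, 1.0],              # lam - mu_dis_2 <= 0
]
b_ub = [-0.30 / 0.60, -0.30 / 0.60, 0.85 / 0.50, 0.85 / 0.50]
bounds = [(0.35, 0.85), (0.35, 0.85), (0.0, 1.0)]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x1, x2, lam = res.x
print(f"removal levels: {x1:.2f}, {x2:.2f}; minimum satisfaction: {lam:.2f}")
```

The MAX-BIAS formulation would add a second objective term that pushes discharger satisfactions away from the minimum level; it is not shown here.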

Relevance:

30.00%

Publisher:

Abstract:

Uncertainty plays an important role in water quality management problems. The major sources of uncertainty in a water quality management problem are the random nature of hydrologic variables and the imprecision (fuzziness) associated with the goals of the dischargers and pollution control agencies (PCA). Many waste load allocation (WLA) problems are solved by considering these two sources of uncertainty. Apart from randomness and fuzziness, missing data in the time series of a hydrologic variable may introduce additional uncertainty due to partial ignorance. These uncertainties render the input parameters imprecise in water quality decision making. In this paper, an Imprecise Fuzzy Waste Load Allocation Model (IFWLAM) is developed for water quality management of a river system subject to uncertainty arising from partial ignorance. In a WLA problem, both randomness and imprecision can be addressed simultaneously through the fuzzy risk of low water quality. A methodology is developed for computing the imprecise fuzzy risk of low water quality when the parameters are characterized by uncertainty due to partial ignorance. A Monte Carlo simulation is performed to evaluate the imprecise fuzzy risk of low water quality by treating the input variables as imprecise. Fuzzy multiobjective optimization is used to formulate the multiobjective model, with max-min as the operator. This usually does not result in a unique solution but gives multiple solutions. Two optimization models are therefore developed to capture all the decision alternatives or multiple solutions. Their objective is to obtain a range of fractional removal levels for the dischargers such that the resultant fuzzy risk remains within acceptable limits. Specifying a range for the fractional removal levels enhances flexibility in decision making. The methodology is demonstrated with a case study of the Tunga-Bhadra river system in India.
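
The sketch below illustrates one plausible way to obtain an imprecise fuzzy risk of low water quality by Monte Carlo simulation, in the spirit of the description above: dissolved-oxygen samples are drawn from a distribution representing hydrologic randomness, each sample is mapped through a membership function for "low water quality", and the simulation is repeated at the bounds of an imprecise parameter to obtain a risk interval. The membership breakpoints, DO statistics and interval bounds are invented for illustration and are not the paper's values.

```python
import numpy as np

def fuzzy_risk_of_low_quality(do_samples, do_min=4.0, do_des=6.0):
    """Fuzzy risk = expected membership in the fuzzy set 'low water quality'.

    Membership is 1 below do_min (mg/L), 0 above do_des, linear in between
    (illustrative membership function).
    """
    mu_low = np.clip((do_des - do_samples) / (do_des - do_min), 0.0, 1.0)
    return mu_low.mean()

rng = np.random.default_rng(0)
n = 100_000

# Partial ignorance: the mean DO at a checkpoint is only known to lie in an
# interval, so the fuzzy risk itself becomes an interval (imprecise fuzzy risk).
risk_interval = []
for do_mean in (5.2, 5.8):                               # lower / upper bound of the imprecise mean
    do = rng.normal(loc=do_mean, scale=0.6, size=n)      # randomness of hydrologic inputs
    risk_interval.append(fuzzy_risk_of_low_quality(do))

print(f"imprecise fuzzy risk of low water quality: "
      f"[{min(risk_interval):.3f}, {max(risk_interval):.3f}]")
```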

Relevance:

30.00%

Publisher:

Abstract:

Chips were produced by orthogonal cutting of a cast pure magnesium billet on a lathe with three different tool rake angles, viz. -15 degrees, -5 degrees and +15 degrees. Chip consolidation by a solid-state recycling technique involved cold compaction followed by hot extrusion. The extruded products were characterized for microstructure and mechanical properties. The chip-consolidated product from the -15 degree rake angle tool showed a 19% increase in tensile strength, a 60% reduction in grain size and a 12% increase in hardness compared to the +15 degree rake chip-consolidated product, indicating better chip bonding and grain refinement. The microstructure of the fracture specimen supports this finding. Overall, the present work highlights the importance of tool rake angle in determining the quality of chip-consolidated products. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present a growing and pruning radial basis function (GAP-RBF) based no-reference (NR) image quality model for JPEG-coded images. The quality of the images is estimated without reference to their original images. The features for predicting the perceived image quality are extracted by considering key human visual sensitivity (HVS) factors such as edge amplitude, edge length, background activity and background luminance. Image quality estimation involves computing the functional relationship between the HVS features and subjective test scores. Here, the problem of quality estimation is transformed into a function approximation problem and solved using a GAP-RBF network, which employs a sequential learning algorithm to approximate the functional relationship. The computational complexity and memory requirement of the GAP-RBF algorithm are lower than those of batch learning algorithms. The GAP-RBF algorithm also yields a compact image quality model and does not require retraining when new image samples are presented. Experimental results show that the GAP-RBF image quality model emulates the mean opinion score (MOS) well. The subjective test results of the proposed metric are compared with a JPEG no-reference image quality index as well as a full-reference structural similarity image quality index, and the proposed metric is observed to outperform both.
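
As a rough illustration of sequential, growing RBF learning of this kind, the sketch below adds a hidden neuron only when a new (feature vector, MOS) sample is both novel and poorly predicted, and otherwise adapts the nearest neuron. It is a drastic simplification: the actual GAP-RBF algorithm uses a neuron-significance criterion for both growing and pruning, which is omitted here, and all thresholds are illustrative.

```python
import numpy as np

class GrowingRBF:
    """Drastically simplified sequential RBF regressor (illustrative only)."""

    def __init__(self, e_min=0.1, d_min=0.3, width=0.5, lr=0.1):
        self.centres, self.weights = [], []
        self.e_min, self.d_min, self.width, self.lr = e_min, d_min, width, lr

    def _phi(self, x):
        # Gaussian activation of every hidden neuron for input x
        return np.array([np.exp(-np.sum((x - c) ** 2) / (2 * self.width ** 2))
                         for c in self.centres])

    def predict(self, x):
        if not self.centres:
            return 0.0
        return float(self._phi(x) @ np.array(self.weights))

    def learn(self, x, y):
        # One (HVS feature vector, MOS) sample at a time, as in sequential learning.
        x = np.asarray(x, float)
        err = y - self.predict(x)
        dist = (min(np.linalg.norm(x - c) for c in self.centres)
                if self.centres else np.inf)
        if abs(err) > self.e_min and dist > self.d_min:
            self.centres.append(x)          # grow: place a new neuron at the sample
            self.weights.append(err)
        elif self.centres:
            k = int(np.argmin([np.linalg.norm(x - c) for c in self.centres]))
            self.weights[k] += self.lr * err  # adapt the nearest existing neuron
```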

Relevance:

30.00%

Publisher:

Abstract:

Deterministic models have been widely used to predict water quality in distribution systems, but their calibration requires extensive and accurate data sets for numerous parameters. In this study, alternative data-driven modeling approaches based on artificial neural networks (ANNs) were used to predict temporal variations of two important characteristics of water quality: chlorine residual and biomass concentrations. The authors considered three types of ANN algorithms. Of these, the Levenberg-Marquardt algorithm provided the best results in predicting residual chlorine and biomass with both error-free and "noisy" data. The ANN models developed here can generate water quality scenarios of piped systems in real time to help utilities identify weak points of low chlorine residual and high biomass concentration and select optimum remedial strategies.
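
As an illustration of Levenberg-Marquardt training of a small feed-forward network, the sketch below fits a one-hidden-layer network using SciPy's least-squares solver in LM mode. The input features, the synthetic target and the network size are all invented for illustration; the study itself used its own data sets and ANN implementations.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
X = rng.uniform(size=(200, 3))            # e.g. flow, initial chlorine, water age (illustrative)
y = 1.5 * np.exp(-2.0 * X[:, 2]) * X[:, 1] + 0.05 * rng.normal(size=200)  # synthetic residual chlorine

n_in, n_hid = X.shape[1], 5

def unpack(p):
    W1 = p[: n_in * n_hid].reshape(n_in, n_hid)
    b1 = p[n_in * n_hid : n_in * n_hid + n_hid]
    w2 = p[n_in * n_hid + n_hid : -1]
    b2 = p[-1]
    return W1, b1, w2, b2

def residuals(p):
    W1, b1, w2, b2 = unpack(p)
    h = np.tanh(X @ W1 + b1)              # hidden layer
    return h @ w2 + b2 - y                # per-sample prediction error

p0 = 0.1 * rng.normal(size=n_in * n_hid + 2 * n_hid + 1)
fit = least_squares(residuals, p0, method="lm")   # Levenberg-Marquardt
print("training RMSE:", np.sqrt(np.mean(fit.fun ** 2)))
```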

Relevance:

30.00%

Publisher:

Abstract:

Polypyrrole (PPy)-multiwalled carbon nanotube (MWCNT) nanocomposites with various MWCNT loadings were prepared by an in situ inversion emulsion polymerization technique. High loadings of the nanofiller were evaluated because of the inherently high interface area available for charge separation in the nanocomposites. Solution processing of these conducting polymer nanocomposites is difficult because most of them are insoluble in organic solvents. Device-quality films of these composites were therefore prepared by pulsed laser deposition (PLD). A comparative X-ray photoelectron spectroscopy (XPS) study of the bulk and the film shows that there is no chemical modification of the polymer on ablation with the laser. TEM images indicate a PPy layer on the MWCNT surface, and SEM micrographs indicate that the MWCNTs are distributed throughout the film and held together by the polymer matrix. Furthermore, the MWCNT diameter does not change from bulk to film, indicating that the polymer layer remains intact during ablation. Device-quality films were fabricated even for very high loadings (80 wt.% MWCNTs), indicating that laser ablation is a suitable technique for fabricating device-quality films of these nanocomposites. The conductivity of both the bulk and the films was measured using a collinear four-point probe setup. The overall conductivity increases with increasing MWCNT loading, and a comparative study of thickness versus conductivity indicates that the maximum conductivity was observed at a film thickness of around 0.2 μm. (C) 2010 Elsevier B.V. All rights reserved.
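
For reference, conductivity is recovered from a collinear four-point probe measurement on a thin film via the sheet resistance R_s = (π/ln 2)·V/I, resistivity ρ = R_s·t and conductivity σ = 1/ρ (valid for a film much thinner, and laterally much larger, than the probe spacing). The numbers in the example below are illustrative, not the paper's measurements.

```python
import math

def thin_film_conductivity(voltage_v, current_a, thickness_m):
    """Conductivity from a collinear four-point probe measurement on a thin film."""
    sheet_resistance = (math.pi / math.log(2)) * voltage_v / current_a   # ohm per square
    resistivity = sheet_resistance * thickness_m                          # ohm * m
    return 1.0 / resistivity                                              # S / m

# e.g. a 10 mV drop at 1 mA on a 0.2 micrometre film (illustrative values):
print(f"{thin_film_conductivity(10e-3, 1e-3, 0.2e-6):.3g} S/m")
```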

Relevance:

30.00%

Publisher:

Abstract:

Methodologies are presented for minimizing risk in a river water quality management problem. A risk minimization model is developed to minimize the risk of low water quality along a river in the face of conflict among various stakeholders. The model consists of three parts: a water quality simulation model, a risk evaluation model with uncertainty analysis and an optimization model. Sensitivity analysis, First Order Reliability Analysis (FORA) and Monte Carlo simulations are performed to evaluate the fuzzy risk of low water quality. Fuzzy multiobjective programming is used to formulate the multiobjective model. Probabilistic Global Search Lausanne (PGSL), a recently developed global search algorithm, is used to solve the resulting nonlinear optimization problem. The algorithm is based on the assumption that better sets of points are more likely to be found in the neighborhood of good sets of points, so the search is intensified in the regions that contain good solutions. Another risk minimization model is developed that deals only with the moments of the generated probability density functions of the water quality indicators; suitable skewness values of the water quality indicators that lead to low fuzzy risk are identified. Results of the models are compared with those of a deterministic fuzzy waste load allocation model (FWLAM) when the methodologies are applied to a case study of the Tunga-Bhadra river system in southern India with a steady-state BOD-DO model. The fractional removal levels resulting from the risk minimization model are slightly higher, but they result in a significant reduction in the risk of low water quality. (c) 2005 Elsevier Ltd. All rights reserved.
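
The search strategy described above (concentrating sampling near good points) can be caricatured in a few lines. The sketch below is not the actual PGSL algorithm, which maintains and updates probability density functions over each variable; it simply shrinks a uniform sampling box around the best point found so far, and the surrogate "risk" objective and bounds are invented for illustration.

```python
import numpy as np

def intensified_random_search(objective, bounds, n_outer=30, n_samples=200,
                              shrink=0.7, rng=None):
    """Toy global search: progressively concentrate sampling around the best point."""
    rng = np.random.default_rng(rng)
    lo = np.array([b[0] for b in bounds], float)
    hi = np.array([b[1] for b in bounds], float)
    box_lo, box_hi = lo.copy(), hi.copy()
    best_x, best_f = None, np.inf
    for _ in range(n_outer):
        X = rng.uniform(box_lo, box_hi, size=(n_samples, len(bounds)))
        f = np.apply_along_axis(objective, 1, X)
        i = int(np.argmin(f))
        if f[i] < best_f:
            best_x, best_f = X[i], f[i]
        span = (box_hi - box_lo) * shrink            # intensify: shrink the box around the best point
        box_lo = np.clip(best_x - span / 2, lo, None)
        box_hi = np.clip(best_x + span / 2, None, hi)
    return best_x, best_f

# e.g. minimize a surrogate "fuzzy risk" over two fractional removal levels (illustrative):
risk = lambda x: (x[0] - 0.65) ** 2 + (x[1] - 0.55) ** 2
print(intensified_random_search(risk, [(0.3, 0.9), (0.3, 0.9)], rng=0))
```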

Relevance:

30.00%

Publisher:

Abstract:

An attempt is made to discuss briefly the current philosophy and trends in quality assurance through nondestructive testing. The techniques currently in use and those being developed for newer and advanced materials such as composites are reviewed.

Relevance:

30.00%

Publisher:

Abstract:

Many next-generation distributed applications, such as grid computing, require a single source to communicate with a group of destinations. Traditionally, such applications are implemented using multicast communication. A typical multicast session requires creating a shortest-path tree to a fixed set of destinations. The fundamental issue in multicasting data to a fixed set of destinations is receiver blocking: if one of the destinations is not reachable, the entire multicast request (say, a grid task request) may fail. Manycasting is a generalized variation of multicasting that provides the freedom to choose the best subset of destinations from a larger set of candidate destinations. We propose an impairment-aware algorithm to provide a manycasting service in the optical layer, specifically in optical burst switching (OBS). We compare the performance of the proposed manycasting algorithm with traditional multicasting and with multicast with over-provisioning. Our results show a significant improvement in blocking probability with optical-layer manycasting.
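
The difference between multicasting to a fixed destination set and manycasting to the best k of a larger candidate set can be seen in a toy blocking-probability simulation. The sketch below ignores optical-layer impairments, routing and over-provisioning entirely; destinations are simply reachable or not with some probability, and all numbers are illustrative.

```python
import random

def manycast_select(candidates, reachable, k):
    """Serve a manycast request if at least k candidates are reachable (any k of them)."""
    usable = [d for d in candidates if reachable[d]]
    return usable[:k] if len(usable) >= k else None   # None = request blocked

random.seed(0)
n_trials, k, candidates = 100_000, 3, list("ABCDE")
fixed_set = candidates[:k]                            # traditional multicast: fixed destinations
blocked_multi = blocked_many = 0
for _ in range(n_trials):
    reachable = {d: random.random() > 0.1 for d in candidates}   # 10% per-destination blocking
    blocked_multi += not all(reachable[d] for d in fixed_set)     # fails if any fixed destination blocked
    blocked_many += manycast_select(candidates, reachable, k) is None
print(f"multicast blocking: {blocked_multi / n_trials:.3f}, "
      f"manycast blocking: {blocked_many / n_trials:.3f}")
```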

Relevance:

30.00%

Publisher:

Abstract:

This paper proposes a method of sharing power/energy between multiple sources and multiple loads using an integrated magnetic circuit as a junction between sources and sinks. It also presents a particular use of the magnetic circuit as an ac power supply that delivers a sinusoidal voltage to the load irrespective of the presence of the grid, drawing only active power from the grid. The proposed magnetic circuit is a three-energy-port unit, comprising: 1) power/energy from the grid; 2) power/energy from a battery-inverter unit; and 3) power/energy delivered to the load, in its particular application as a quality ac power supply (QPS). The product provides a sinusoidal regulated output voltage, input power-factor correction, electrical isolation between the sources and loads, a low battery voltage, and control simplicity. Unlike conventional series-shunt-compensated uninterruptible power supply topologies with a low battery voltage, the isolation is provided by a single magnetic circuit, resulting in smaller size and lower cost. The circuit operating principles and analysis, as well as simulation and experimental results, are presented for this QPS.

Relevance:

30.00%

Publisher:

Abstract:

Frequency-domain scheduling and rate adaptation have helped next generation orthogonal frequency division multiple access (OFDMA) based wireless cellular systems such as Long Term Evolution (LTE) achieve significantly higher spectral efficiencies. To overcome the severe uplink feedback bandwidth constraints, LTE uses several techniques to reduce the feedback required by a frequency-domain scheduler about the channel state information of all subcarriers of all users. In this paper, we analyze the throughput achieved by the User Selected Subband feedback scheme of LTE. In it, a user feeds back only the indices of the best M subbands and a single 4-bit estimate of the average rate achievable over all selected M subbands. In addition, we compare the performance with the subband-level feedback scheme of LTE, and highlight the role of the scheduler by comparing the performances of the unfair greedy scheduler and the proportional fair (PF) scheduler. Our analysis sheds several insights into the working of the feedback reduction techniques used in LTE.
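
A bare-bones sketch of the User Selected Subband idea analyzed above: the user reports only the indices of its best M subbands together with one coarsely quantized estimate of the average rate over them. The uniform 4-bit quantizer, the rate cap and the per-subband rate model below are illustrative stand-ins, not the 3GPP CQI mapping.

```python
import numpy as np

def user_selected_subband_feedback(subband_rates, M=3, n_bits=4, r_max=6.0):
    """Report the best-M subband indices plus one n_bit quantized average rate (illustrative)."""
    best = np.argsort(subband_rates)[-M:][::-1]                 # indices of the best M subbands
    avg_rate = float(np.mean(subband_rates[best]))              # average rate over the selected subbands
    levels = 2 ** n_bits - 1
    cqi = int(round(np.clip(avg_rate, 0.0, r_max) / r_max * levels))   # single 4-bit estimate
    return best.tolist(), cqi

rng = np.random.default_rng(3)
rates = np.log2(1 + rng.exponential(scale=2.0, size=12))        # per-subband achievable rates (bits/s/Hz)
print(user_selected_subband_feedback(rates))
```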

Relevance:

30.00%

Publisher:

Abstract:

We consider the problem of wireless channel allocation to multiple users. A slot is given to the user with the highest metric (e.g., channel gain) in that slot. The scheduler may not know the channel states of all the users at the beginning of each slot; in this scenario, opportunistic splitting is an attractive solution. However, this algorithm requires that the metrics of different users form independent, identically distributed (i.i.d.) sequences with the same distribution, and that their distribution and the number of users be known to the scheduler. This limits the usefulness of opportunistic splitting. In this paper we develop a parametric version of the algorithm whose optimal parameters are learnt online through a stochastic approximation scheme. Our algorithm does not require the metrics of different users to have the same distribution; the statistics of these metrics and the number of users can be unknown and may vary with time, and each metric sequence can be Markov. We prove the convergence of the algorithm and demonstrate its utility by scheduling the channel to maximize throughput while satisfying fairness and/or quality-of-service constraints.
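
The flavour of threshold-based opportunistic scheduling with an online parameter update can be conveyed with a toy single-threshold version: a slot is used only when exactly one user's metric exceeds the threshold, and the threshold is nudged by a stochastic-approximation-style step so that, on average, about one user exceeds it. This is not the paper's multi-mini-slot splitting algorithm or its fairness/QoS-constrained scheme; all parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)
n_users, n_slots = 10, 50_000
threshold, step = 1.0, 0.01                      # contention threshold and learning step (illustrative)
throughput = 0.0

for _ in range(n_slots):
    gains = rng.exponential(size=n_users)        # per-slot channel metrics, unknown to the scheduler
    contenders = np.flatnonzero(gains > threshold)
    if contenders.size == 1:                     # exactly one user above the threshold: successful slot
        throughput += np.log2(1 + gains[contenders[0]])
    # Stochastic-approximation-style update: idle slots lower the threshold,
    # collisions raise it, driving the expected number of contenders towards one.
    threshold += step * (contenders.size - 1)

print("average throughput per slot:", throughput / n_slots,
      "final threshold:", round(threshold, 2))
```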