32 results for Empirical facts

at Indian Institute of Science - Bangalore - India


Relevance:

30.00%

Publisher:

Abstract:

The torsional potential functions Vt(φ) and Vt(ψ) around the single bonds N–Cα and Cα–C, which can be used in conformational studies of oligopeptides, polypeptides and proteins, have been derived using crystal structure data of 22 globular proteins, by fitting the observed distribution in the (φ, ψ)-plane to Vtot(φ, ψ) through the Boltzmann distribution. The averaged torsional potential functions, obtained from various amino acid residues in the L-configuration, are Vt(φ) = –1.0 cos(φ + 60°); Vt(ψ) = –0.5 cos(ψ + 60°) – 1.0 cos(2ψ + 30°) – 0.5 cos(3ψ + 30°). The dipeptide energy maps Vtot(φ, ψ) obtained using these functions, instead of the normally accepted torsional functions, were found to explain various observations, such as the absence of the left-handed alpha helix and the C7 conformation, and the relatively high density of points near the line ψ = 0°. These functions, derived from observational data on protein structures, will, it is hoped, explain various previously unexplained facts in polypeptide conformation.
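The averaged torsional functions quoted above are given explicitly, so they can be evaluated directly. The Python sketch below is an illustration only; units are assumed to be kcal/mol, as is conventional for such potentials, and angles are taken in degrees as written. The full dipeptide map Vtot(φ, ψ) would also include the non-bonded and electrostatic terms, which are not given in the abstract.

```python
import numpy as np

def vt_phi(phi_deg):
    """Averaged torsional potential about N-Calpha, as quoted in the abstract:
    Vt(phi) = -1.0 cos(phi + 60 deg)."""
    phi = np.radians(phi_deg)
    return -1.0 * np.cos(phi + np.radians(60.0))

def vt_psi(psi_deg):
    """Averaged torsional potential about Calpha-C:
    Vt(psi) = -0.5 cos(psi + 60) - 1.0 cos(2 psi + 30) - 0.5 cos(3 psi + 30)."""
    psi = np.radians(psi_deg)
    return (-0.5 * np.cos(psi + np.radians(60.0))
            - 1.0 * np.cos(2.0 * psi + np.radians(30.0))
            - 0.5 * np.cos(3.0 * psi + np.radians(30.0)))

# Tabulate the torsional contribution on a coarse 30-degree (phi, psi) grid.
phi, psi = np.meshgrid(np.arange(-180, 181, 30), np.arange(-180, 181, 30))
v_torsional = vt_phi(phi) + vt_psi(psi)
print(v_torsional.shape)  # (13, 13)
```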

Relevance:

20.00%

Publisher:

Abstract:

A semi-empirical model is presented for describing the interionic interactions in molten salts using the experimentally available structure data. An extension of Bertaut's method of non-overlapping charges is used to estimate the electrostatic interaction energy in ionic melts. It is shown, in agreement with earlier computer simulation studies, that this energy increases when an ionic salt melts. The repulsion between ions is described using a compressible ion theory which uses structure-independent parameters. The van der Waals interactions and the thermal free energy are also included in the total energy, which is minimised with respect to isostructural volume variations to calculate the equilibrium density. Detailed results are presented for three molten systems, NaCl, CaCl2 and ZnCl2, and are shown to be in satisfactory agreement with experiments. With reliable structural data now being reported for several other molten salts, the present study gains relevance.
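The final step of the scheme described above, minimising the total energy with respect to isostructural volume variations to obtain the equilibrium density, can be sketched briefly. In the sketch below the individual energy terms are placeholders passed in as callables; the actual Bertaut-style electrostatic, compressible-ion repulsion, van der Waals and thermal contributions of the paper are not reproduced here.

```python
from scipy.optimize import minimize_scalar

def total_energy(volume, terms):
    """Sum the energy contributions, each supplied as a callable of molar volume."""
    return sum(term(volume) for term in terms)

def equilibrium_volume(terms, v_lo, v_hi):
    """Volume that minimises the total energy within the bracket [v_lo, v_hi]."""
    res = minimize_scalar(total_energy, bounds=(v_lo, v_hi),
                          args=(terms,), method="bounded")
    return res.x

# Toy usage with two placeholder terms (arbitrary units); for E(V) = 1/V + 0.02*V
# the minimum falls near V ~ 7, well inside the bracket.
v_eq = equilibrium_volume([lambda v: 1.0 / v, lambda v: 0.02 * v], 1.0, 100.0)
print(v_eq)
```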

Relevance:

20.00%

Publisher:

Abstract:

EEG recordings are often contaminated with ocular artifacts such as eye blinks and eye movements. These artifacts may obscure underlying brain activity in the electroencephalogram (EEG) data and make its analysis difficult. In this paper, we explore the use of an empirical mode decomposition (EMD) based filtering technique to correct eye-blink and eye-movement artifacts in single-channel EEG data. In this method, the single-channel EEG data containing the ocular artifact is segmented such that the artifact in each segment can be treated as a slowly varying trend in the data, and EMD is used to remove this trend. The filtering is done by partial reconstruction from the components of the decomposition. The method is completely data dependent and hence adaptive and nonlinear. Experimental results are provided to check the applicability of the method on real EEG data, and the results are quantified using power spectral density (PSD) as a measure. The method gives fairly good results and does not make use of any prior knowledge of the artifacts or of the EEG data used.
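The partial-reconstruction step can be sketched as follows. The sketch assumes an EMD implementation is available as a callable (for instance, a thin wrapper around a library routine) that returns the intrinsic mode functions of a 1-D signal ordered from fastest to slowest; which components carry the ocular trend is a choice of the method, and dropping the slowest n_drop components here is only an illustrative default.

```python
import numpy as np

def remove_ocular_trend(segment, emd, n_drop=1):
    """Correct one single-channel EEG segment by partial reconstruction.

    `emd` is assumed to be a callable returning the intrinsic mode functions
    of a 1-D signal as an array of shape (n_imfs, n_samples), ordered from
    the fastest oscillation to the slowest (residual trend last). The slowly
    varying ocular trend is assumed to live in the last `n_drop` components,
    which are dropped before summing the remainder.
    """
    imfs = np.asarray(emd(segment))
    return imfs[:-n_drop].sum(axis=0)
```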

Relevance:

20.00%

Publisher:

Abstract:

Inventory management (IM) has a decisive role in enhancing the competitiveness of the manufacturing industry. Therefore, major manufacturing industries follow IM practices with the intention of improving their performance. However, the effort to introduce IM in SMEs is very limited owing to a lack of initiative and expertise, as well as financial constraints. This paper aims to provide a guideline for entrepreneurs in enhancing their IM performance, as it presents the results of a survey-based study carried out for machine tool Small and Medium Enterprises (SMEs) in Bangalore. Having established the significance of inventory as an input, we probed the relationship between IM performance and the economic performance of these SMEs. To the extent possible, all the factors of production and performance indicators were deliberately considered in pure economic terms. All the economic performance indicators adopted appear to have a positive and significant association with IM performance in SMEs. On the whole, we found that SMEs which are IM efficient are likely to perform better on the economic front as well and to experience higher returns to scale.

Relevance:

20.00%

Publisher:

Abstract:

The electrical conduction in insulating materials is a complex process, and several theories have been suggested in the literature. Many phenomenological empirical models are in use in the DC cable literature; however, the impact of using different models for cable insulation has not been investigated until now, beyond claims of their relative accuracy. The steady-state electric field in DC cable insulation is known to be a strong function of the DC conductivity, which in turn is a complex function of electric field and temperature. As a result, under certain conditions, the stress at the cable screen is higher than that at the conductor boundary. The paper presents detailed investigations on using different empirical conductivity models suggested in the literature for HVDC cable applications. It is expressly shown that certain models give rise to erroneous results in electric field and temperature computations. It is pointed out that the use of these models in the design or evaluation of cables will lead to errors.
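For orientation only, the sketch below evaluates one commonly quoted empirical form for the DC conductivity of polymeric insulation, an exponential dependence on temperature and field magnitude. This is an assumed illustrative model with placeholder constants; the specific models compared in the paper are not reproduced in the abstract.

```python
import numpy as np

def dc_conductivity(temp_c, e_field, sigma0=1e-16, alpha=0.1, beta=0.03e-6):
    """Illustrative empirical DC-conductivity model for polymeric insulation,
    sigma = sigma0 * exp(alpha*T + beta*|E|), returned in S/m.

    temp_c is the temperature in degrees Celsius and e_field the local field
    in V/m; sigma0, alpha and beta are placeholder fitting constants, not
    values taken from the paper.
    """
    return sigma0 * np.exp(alpha * temp_c + beta * np.abs(e_field))

# The field and temperature dependence of sigma is what can push the maximum
# stress from the conductor towards the screen; compare two operating points:
print(dc_conductivity(70.0, 20e6), dc_conductivity(50.0, 10e6))
```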

Relevance:

20.00%

Publisher:

Abstract:

Supercritical processes have gained importance in the last few years in food, environmental and pharmaceutical product processing. The design of any supercritical process needs accurate experimental data on the solubilities of solids in supercritical fluids (SCFs). Empirical equations are quite successful in correlating the solubilities of solid compounds in SCFs, both in the presence and in the absence of cosolvents. In this work, existing solvate complex models are discussed and a new set of empirical equations is proposed. These equations correlate the solubilities of solids in supercritical carbon dioxide (both in the presence and absence of cosolvents) as a function of temperature, density of supercritical carbon dioxide and the mole fraction of cosolvent. The accuracy of the proposed models was evaluated by correlating 15 binary and 18 ternary systems, and the proposed models provided the best overall correlations. © 2009 Elsevier B.V. All rights reserved.
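The proposed equations themselves are not reproduced in the abstract. As an illustration of how a density-based empirical solubility correlation of the kind described (temperature, CO2 density and cosolvent mole fraction as the independent variables) is fitted, the sketch below fits a generic form ln y = a0 + a1/T + a2 ln(rho) + a3 x_cos by least squares; the functional form and variable names are assumptions, not the paper's models.

```python
import numpy as np

def fit_solubility_correlation(T, rho, x_cos, y):
    """Least-squares fit of an illustrative density-based correlation
    ln y = a0 + a1/T + a2*ln(rho) + a3*x_cos.

    T: temperatures (K), rho: CO2 densities, x_cos: cosolvent mole fractions,
    y: measured solute mole fractions (all 1-D arrays of equal length).
    Returns the fitted coefficients [a0, a1, a2, a3].
    """
    T, rho, x_cos, y = map(np.asarray, (T, rho, x_cos, y))
    X = np.column_stack([np.ones_like(T), 1.0 / T, np.log(rho), x_cos])
    coef, *_ = np.linalg.lstsq(X, np.log(y), rcond=None)
    return coef
```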

Relevance:

20.00%

Publisher:

Abstract:

This paper deals with the development of simplified semi-empirical relations for predicting the residual velocities of small-calibre projectiles impacting mild steel target plates, normally or at an angle, and the ballistic limits of such plates. It is shown, for several impact cases for which test results on perforation of mild steel plates are available, that most of the existing semi-empirical relations, which are applicable only to normal projectile impact, do not yield satisfactory estimates of residual velocity. Furthermore, it is difficult to quantify some of the empirical parameters present in these relations for a given problem. With an eye towards simplicity and ease of use, two new regression-based relations employing standard material parameters are discussed here for predicting residual velocity and ballistic limit for both normal and oblique impact. The two expressions differ in their use of quasi-static or strain-rate-dependent average plate material strength. Residual velocities yielded by the present semi-empirical models compare well with the experimental results. Additionally, ballistic limits from these relations show close correlation with the corresponding finite-element-based predictions.
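The paper's own regression-based expressions are not reproduced in the abstract. For orientation, the sketch below evaluates a widely used semi-empirical residual-velocity relation of the Lambert–Jonas type, shown only to illustrate the class of relation under discussion; a and p are empirical fitting constants and the values used are placeholders.

```python
def residual_velocity(v_impact, v_ballistic_limit, a=1.0, p=2.0):
    """Lambert-Jonas-type semi-empirical relation:
    v_r = a * (v_i**p - v_bl**p)**(1/p) for v_i > v_bl, else 0.
    a and p are empirical constants fitted to perforation test data.
    """
    if v_impact <= v_ballistic_limit:
        return 0.0
    return a * (v_impact**p - v_ballistic_limit**p) ** (1.0 / p)

# e.g. a projectile striking at 400 m/s against a ballistic limit of 250 m/s:
print(residual_velocity(400.0, 250.0))  # ~312 m/s with the placeholder a, p
```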

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes the use of empirical modeling techniques for building microarchitecture-sensitive models of compiler optimizations. The models we build relate program performance to settings of compiler optimization flags, associated heuristics and key microarchitectural parameters. Unlike traditional analytical modeling methods, this relationship is learned entirely from data obtained by measuring performance at a small number of carefully selected compiler/microarchitecture configurations. We evaluate three different learning techniques in this context, viz. linear regression, adaptive regression splines and radial basis function networks. We use the generated models to (a) predict program performance at arbitrary compiler/microarchitecture configurations, (b) quantify the significance of complex interactions between optimizations and the microarchitecture, and (c) efficiently search for 'optimal' settings of optimization flags and heuristics for any given microarchitectural configuration. Our evaluation using benchmarks from the SPEC CPU2000 suite suggests that accurate models (< 5% average prediction error) can be generated using a reasonable number of simulations. We also find that using compiler settings prescribed by a model-based search can improve program performance by as much as 19% (with an average of 9.5%) over highly optimized binaries.
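A minimal sketch of the underlying idea follows: learn a mapping from (flag settings, microarchitectural parameters) to measured performance, then query it at unmeasured configurations. Linear regression is the simplest of the three techniques the paper evaluates; the feature encoding, the knobs implied by the columns, and the performance numbers below are hypothetical placeholders, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: three binary optimization-flag settings followed by two
# microarchitectural knobs (say, cache size in KB and issue width);
# all values are illustrative only.
X = np.array([[1, 0, 1, 32, 4],
              [0, 1, 1, 64, 4],
              [1, 1, 0, 32, 8],
              [0, 0, 1, 64, 8]], dtype=float)
cycles = np.array([1.00e9, 0.92e9, 0.97e9, 0.88e9])  # measured performance

model = LinearRegression().fit(X, cycles)

# Predict performance at a configuration that was never measured, the way a
# model-based search over flag settings would score candidate points.
candidate = np.array([[1, 1, 1, 64, 8]], dtype=float)
print(model.predict(candidate))
```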

Relevance:

20.00%

Publisher:

Abstract:

A careful comparison of the distribution in the (R, θ)-plane of all NH···O hydrogen bonds with that for bonds between neutral NH and neutral C=O groups indicated that the latter has a larger mean R and a wider range of θ, and that the distribution is also broader than for the average case. Therefore, the potential function developed earlier for an average NH···O hydrogen bond was modified to suit the peptide case. A three-parameter expression for Vhb in Δ = R − Rmin and θ was found to be satisfactory. By comparing the theoretically expected distribution in R and θ with the observed data (although limited), the best values were found to be p1 = 25, p3 = −2 and q1 = 1 × 10⁻³, with Rmin = 2·95 Å and Vmin = −4·5 kcal/mole. The procedure for obtaining a smooth transition from Vhb to the non-bonded potential Vnb for large R and θ is described, along with a flow chart useful for programming the formulae. Calculated values of ΔH, the enthalpy of formation of the hydrogen bond, using this function are in reasonable agreement with observation. When the atoms involved in the hydrogen bond occur in a five-membered ring, as in the sequence shown in the figure in the full text, a different formula for the potential function is needed, which is of the form Vhb = Vmin + p1Δ² + q1x², where x = θ − 50° for θ ≥ 50°, with p1 = 15, q1 = 0·002, Rmin = 2· Å and Vmin = −2·5 kcal/mole. © 1971 Indian Academy of Sciences.
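The five-membered-ring form quoted at the end is complete enough to evaluate, apart from Rmin, whose value is incomplete in this record and is therefore left as a caller-supplied argument below. Treating x as zero for θ below 50° is an assumption made here for illustration; the abstract defines x only for θ ≥ 50°.

```python
def vhb_ring(r, theta_deg, r_min, p1=15.0, q1=0.002, v_min=-2.5):
    """Hydrogen-bond potential (kcal/mole) for the five-membered-ring case:
    Vhb = Vmin + p1*Delta**2 + q1*x**2, with Delta = R - Rmin and
    x = theta - 50 degrees for theta >= 50 (taken as 0 below 50 here).
    r_min must be supplied; its value is incomplete in this abstract record.
    """
    delta = r - r_min
    x = max(theta_deg - 50.0, 0.0)
    return v_min + p1 * delta**2 + q1 * x**2
```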

Relevance:

20.00%

Publisher:

Abstract:

An algorithm is presented for the optimal allocation of reactive power in an AC/DC system using FACTS devices, with the objective of improving the voltage profile and the voltage stability of the system. The technique attempts to utilize fully the reactive power sources in the system to improve voltage stability and the voltage profile, as well as to meet the reactive power requirements at the AC-DC terminals so as to facilitate smooth operation of the DC links. The method involves the successive solution of steady-state power flows and the optimization of reactive power control variables, with a Unified Power Flow Controller (UPFC), using a linear programming technique. The proposed method has been tested on a real-life equivalent 96-bus AC system with a two-terminal DC system, under normal and contingency conditions.
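The linear programming step can be sketched in miniature. The sensitivity matrix, required voltage corrections and control limits below are hypothetical placeholders standing in for the linearised relationships between reactive-power control variables (including the UPFC settings) and bus voltages that such a formulation would use; this is not the paper's actual formulation.

```python
from scipy.optimize import linprog

# Hypothetical linearised sensitivities of two bus voltages to two
# reactive-power control adjustments (per unit), and the voltage
# corrections required in this iteration.
S = [[0.02, 0.01],
     [0.01, 0.03]]
dv = [0.002, 0.003]
cost = [1.0, 1.0]                       # minimise total control effort

# Require S @ u >= dv, written as -S @ u <= -dv for linprog.
res = linprog(cost,
              A_ub=[[-s for s in row] for row in S],
              b_ub=[-v for v in dv],
              bounds=[(0.0, 0.1), (0.0, 0.1)])
print(res.x)  # control adjustments to apply before the next power flow
```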

Relevance:

20.00%

Publisher:

Abstract:

A technique based on empirical orthogonal functions is used to estimate hydrologic time-series variables at ungaged locations. The technique is applied to estimate daily and monthly rainfall, temperature and runoff values. The accuracy of the method is tested by application to locations where data are available. The second-order characteristics of the estimated data are compared with those of the observed data. The results indicate that the method is quick and accurate.
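Empirical orthogonal function analysis amounts to decomposing a station-by-time data matrix into spatial patterns and temporal amplitude series, typically via a singular value decomposition. The sketch below shows that decomposition and a simple reconstruction at an ungaged site from assumed spatial loadings; how those loadings are obtained for an ungaged location (interpolation, regression on station attributes, etc.) is the part of the method the abstract does not detail, so it is left to the caller here.

```python
import numpy as np

def eof_decompose(data, n_modes):
    """EOF decomposition of a (n_stations, n_times) anomaly matrix.

    Returns the station loadings (spatial patterns scaled by the singular
    values) and the temporal amplitude series for the leading n_modes.
    """
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    spatial = u[:, :n_modes] * s[:n_modes]
    temporal = vt[:n_modes]
    return spatial, temporal

def estimate_ungaged(loadings, temporal):
    """Reconstruct a series at an ungaged site from its (assumed or
    interpolated) loadings on the retained modes."""
    return np.asarray(loadings) @ temporal
```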

Relevance:

20.00%

Publisher:

Abstract:

In this paper, the performance of distance relays applied to a transmission system equipped with a shunt FACTS device, the Static Synchronous Compensator (STATCOM), is described. The aim of the study is to evaluate the performance of distance relays when a STATCOM is incorporated at the midpoint of transmission lines for voltage control. A detailed model of the STATCOM and its control strategy is presented. The presence of these devices significantly affects the apparent impedance seen by the distance relays owing to their rapid response to different power system configurations. The distance relay is evaluated for different loading conditions and for different fault locations, with faults created under various pre-fault loading conditions. The studies are performed on 400 kV and 132 kV systems and the results are presented. Simulation studies are carried out using the transient simulation software PSCAD/EMTDC.
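The quantity being disturbed is the apparent impedance, the ratio of the fundamental-frequency voltage and current phasors measured at the relaying point; the STATCOM's injected current shifts where this ratio lands relative to the relay's operating characteristic. A minimal sketch, with hypothetical phasor values:

```python
import numpy as np

def apparent_impedance(v_phasor, i_phasor):
    """Apparent impedance seen by a distance relay: the ratio of the measured
    voltage and current phasors at the relaying point (ohms)."""
    return v_phasor / i_phasor

# Hypothetical phase quantities at the relay location during a fault.
z = apparent_impedance(230e3 * np.exp(1j * 0.0), 1.2e3 * np.exp(-1j * 0.6))
print(z.real, z.imag)  # apparent resistance and reactance in ohms
```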

Relevance:

20.00%

Publisher:

Abstract:

Automated synthesis of mechanical designs is an important step towards the development of an intelligent CAD system. Research into methods for supporting conceptual design using automated synthesis has attracted much attention in the past decades. The work presented here is based on the processes of synthesizing multiple-state mechanical devices carried out individually by ten engineering designers. The designers are asked to think aloud while carrying out the synthesis. The ten design synthesis processes are video recorded, and the records are transcribed and coded to identify the activities occurring in the synthesis processes, as well as the inputs to and outputs from those activities. A mathematical representation for specifying a multi-state design task is proposed. Further, a descriptive model capturing all ten synthesis processes is developed and presented in this paper. This model will be used to identify the outstanding issues that must be resolved before a system can be developed for supporting design synthesis of multiple-state mechanical devices that is capable of creating a comprehensive variety of solution alternatives.

Relevance:

20.00%

Publisher:

Abstract:

The research reported in this paper checks whether a framework for designing, GEMS of SAPPhIRE as req-sol, developed earlier, can support the designing of novel concepts. This is done by asking two questions: (a) Is there a relationship between the constructs of the framework and novelty? (b) If there is a relationship, what is its degree? A hypothesis, that an increase in the size and variety of the ideas used while designing should enhance the variety of the concepts produced, leading to an increase in the novelty of the concept space, is developed to explain the relationship between novelty and the constructs. Eight existing observational studies of designing sessions, each involving an individual designer solving a conceptual design problem while following a think-aloud protocol, are used for the analysis. The hypothesis is verified empirically using these observational studies. The results also show a strong correlation between novelty and the constructs of the framework; the correlation decreases as the abstraction level of the constructs reduces, signifying the importance of using constructs at higher abstraction levels, especially for novelty.