18 results for R-Statistical computing

at Indian Institute of Science - Bangalore - India


Relevance:

30.00%

Publisher:

Abstract:

Using surface charts at 0330 GMT, the movement of the monsoon trough during the months June to September 1990 at two fixed longitudes, namely 79 degrees E and 85 degrees E, is studied. The probability distribution of trough position shows that the median, mean and mode occur at progressively more northern latitudes, especially at 85 degrees E, with a pronounced mode that is close to the northern-most limit reached by the trough. A spectral analysis of the fluctuating latitudinal position of the trough is carried out using FFT and the Maximum Entropy Method (MEM). Both methods show significant peaks around 7.5 and 2.6 days, and a less significant one around 40-50 days. The two peaks at the shorter periods are more prominent at the eastern longitude. MEM shows an additional peak around 15 days. A study of the weather systems that occurred during the season shows them to have a duration of around 3 days and an interval between systems of around 9 days, suggesting a possible correlation with the dominant short periods observed in the spectrum of trough position.
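
As a rough illustration of the spectral step described above, a minimal FFT periodogram in Python can recover dominant periods from a daily latitude series. The synthetic series below, its embedded 7.5- and 2.6-day oscillations and the noise level are assumptions for demonstration only, not the study's data.

```python
# Sketch: locate dominant periods in a daily time series with an FFT periodogram.
import numpy as np

rng = np.random.default_rng(0)
days = np.arange(122)                           # June-September, one value per day
lat = (21.0
       + 1.0 * np.sin(2 * np.pi * days / 7.5)   # assumed 7.5-day oscillation
       + 0.5 * np.sin(2 * np.pi * days / 2.6)   # assumed 2.6-day oscillation
       + 0.3 * rng.standard_normal(days.size))  # noise

lat = lat - lat.mean()                          # remove the mean before the FFT
power = np.abs(np.fft.rfft(lat)) ** 2           # one-sided periodogram
freq = np.fft.rfftfreq(days.size, d=1.0)        # cycles per day

# Report the strongest periods (skip the zero-frequency bin).
top = np.argsort(power[1:])[::-1][:3] + 1
for k in top:
    print(f"period ~ {1.0 / freq[k]:.1f} days, power {power[k]:.1f}")
```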

Relevance:

30.00%

Publisher:

Abstract:

The charge at which adsorption of organic compounds attains a maximum (σ_M^max) at an electrochemical interface is analysed using several multi-state models in a hierarchical manner. The analysis is based on statistical mechanical results for the following models: (A) two-state site parity, (B) two-state multi-site, and (C) three-state site parity. The coulombic interactions due to permanent and reduced dipole effects (using the mean field approximation), electrostatic field effects and specific substrate interactions have been taken into account. The simplest model in the hierarchy (two-state site parity) yields the explicit dependence of σ_M^max on the permanent dipole moment, the polarizability of the solvent and the adsorbate, the lattice spacing, the effective coordination number, etc. Other models in the hierarchy bring to light the influence of the solvent structure, the role of substrate interactions, etc. As a result of this approach, the "composition" of σ_M^max in terms of the fundamental molecular constants becomes clear. With a view to using these molecular results to maximum advantage, the derived results for σ_M^max have been converted into those involving experimentally observable parameters like C_0, C_1, E_N, etc. Wherever possible, some of the earlier phenomenological relations reported for σ_M^max, notably by Parsons, Damaskin and Frumkin, and Trasatti, are shown to have a certain molecular basis, viz. a simple two-state site parity model. As a corollary to the hierarchical modelling, σ_M^max and the potential corresponding to it (E_max) are shown to be constants independent of θ_max or C_org for all models. The implication of our analysis for σ_M^max with respect to that predicted by the generalized surface layer equation (which postulates σ_M^max and E_max variation with θ) is discussed in detail. Finally, we discuss in passing σ_M^max and the electrosorption valency in this context.

Relevance:

30.00%

Publisher:

Abstract:

The recently introduced generalized pencil of Sudarshan which gives an exact ray picture of wave optics is analysed in some situations of interest to wave optics. A relationship between ray dispersion and statistical inhomogeneity of the field is obtained. A paraxial approximation which preserves the rectilinear propagation character of the generalized pencils is presented. Under this approximation the pencils can be computed directly from the field conditions on a plane, without the necessity to compute the cross-spectral density function in the entire space as an intermediate quantity. The paraxial results are illustrated with examples. The pencils are shown to exhibit an interesting scaling behaviour in the far-zone. This scaling leads to a natural generalization of the Fraunhofer range criterion and of the classical van Cittert-Zernike theorem to planar sources of arbitrary state of coherence. The recently derived results of radiometry with partially coherent sources are shown to be simple consequences of this scaling.

Relevance:

30.00%

Publisher:

Abstract:

The existing internet computing resource, Biomolecules Segment Display Device (BSDD), has been updated with several additional useful features. An advanced option is provided to superpose the structural motifs obtained from a search on the Protein Data Bank (PDB), in order to see whether the three-dimensional structures adopted by identical or similar sequence motifs are the same. Furthermore, options to display structural aspects like inter- and intra-molecular interactions, ion pairs, disulphide bonds, etc. have been provided. The updated resource is interfaced with an up-to-date copy of the public domain PDB as well as the 25 and 90% non-redundant protein structure data sets. Further, users can upload three-dimensional atomic coordinates (PDB format) from the client machine. A free molecular graphics program, Jmol, is interfaced with the resource to display the three-dimensional structures.
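
A common way to superpose two equal-length structural motifs, of the kind the server offers, is the Kabsch algorithm. The numpy sketch below is an illustrative stand-in, not the code used by BSDD.

```python
# Minimal Kabsch superposition of two equal-length coordinate sets.
import numpy as np

def kabsch_rmsd(P, Q):
    """Optimal-rotation RMSD between (N, 3) coordinate arrays P and Q."""
    P = P - P.mean(axis=0)                  # center both fragments
    Q = Q - Q.mean(axis=0)
    U, S, Vt = np.linalg.svd(P.T @ Q)       # SVD of the covariance matrix
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # correct for a possible reflection
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T                      # optimal rotation matrix
    return float(np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1))))
```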

Relevance:

30.00%

Publisher:

Abstract:

A method is developed for demonstrating how solitons with some internal periodic motion may emerge as elementary excitations in the statistical mechanics of field systems. The procedure is demonstrated in the context of complex scalar fields which can, for appropriate choices of the Lagrangian, yield charge-carrying solitons with such internal motion. The derivation uses the techniques of the steepest-descent method for functional integrals. It is shown that, despite the constraint of some fixed total charge, a gaslike excitation of such charged solitons does emerge.

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with the analysis of the liquid limit of soils, an inferential parameter of universal acceptance. It has been undertaken primarily to re-examine one-point methods of determination of liquid limit water contents. It has been shown, from the basic characteristics of soils and associated physico-chemical factors, that critical shear strengths at liquid limit water contents arise out of force field equilibrium and are independent of soil type. This provides a scientific basis for liquid limit determination by one-point methods, which hitherto were formulated purely on statistical analysis of data. Available methods (Norman, 1959; Karlsson, 1961; Clayton & Jukes, 1978) of one-point liquid limit determination have been critically re-examined. A simple one-point cone penetrometer method of computing the liquid limit has been suggested and compared with other methods. The experimental data of Sherwood & Ryley (1970) have been employed for comparison of the different cone penetration methods. The results indicate that, apart from mere statistical considerations, one-point methods have a strong scientific basis in the uniqueness of the modified flow line irrespective of soil type. The normalized flow line is obtained by normalizing water contents by their liquid limit values, thereby nullifying the effects of surface areas and associated physico-chemical factors that are otherwise reflected in different responses at the macrolevel.
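
To make the one-point idea concrete: given a single (water content, cone penetration) reading, the liquid limit is back-calculated from a normalized flow line. The sketch below assumes an illustrative linear-in-log form w/w_L = a + b·log10(d) with made-up coefficients (chosen so w = w_L at 20 mm penetration); it is not the relation derived in the paper.

```python
# Hypothetical one-point liquid limit computation from a single cone reading.
import math

A, B = 0.65, 0.27   # assumed coefficients of the normalized flow line

def liquid_limit_one_point(w_percent, d_mm):
    """Back-calculate w_L from one (water content %, penetration mm) reading."""
    return w_percent / (A + B * math.log10(d_mm))

# Example: a reading of w = 52% at d = 16 mm gives an estimated liquid limit:
print(round(liquid_limit_one_point(52.0, 16.0), 1))
```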

Relevance:

30.00%

Publisher:

Abstract:

A generalized technique is proposed for modeling the effects of process variations on dynamic power by directly relating the variations in process parameters to variations in the dynamic power of a digital circuit. The dynamic power of a 2-input NAND gate is characterized by mixed-mode simulations, to be used as a library element for 65 nm gate length technology. The proposed methodology is demonstrated on a multiplier circuit built using the NAND gate library, by characterizing its dynamic power through Monte Carlo analysis. The statistical techniques of Response Surface Methodology (RSM), using Design of Experiments (DOE) and the Least Squares Method (LSM), are employed to generate a "hybrid model" for gate power that accounts for simultaneous variations in multiple process parameters. We demonstrate that our hybrid-model-based statistical design approach results in considerable savings in the power budget of low power CMOS designs, with an error of less than 1% and a reduction in uncertainty of at least 6X on a normalized basis, compared against worst case design.
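
The RSM/LSM step can be sketched as fitting a quadratic response surface to Monte Carlo samples. The toy "simulator" and its coefficients below are assumptions standing in for the mixed-mode simulations; only the fitting pattern is illustrated.

```python
# Sketch: quadratic response surface for gate power vs. two process parameters,
# fitted by least squares over Monte Carlo samples.
import numpy as np

rng = np.random.default_rng(1)

def simulated_power(L, Vth):
    # Toy stand-in for a circuit simulation (arbitrary coefficients).
    return (1.0 + 0.8 * L - 0.5 * Vth + 0.3 * L * Vth + 0.2 * L**2
            + 0.05 * rng.standard_normal())

# Monte Carlo "design of experiments" over normalized parameter variations.
L = rng.normal(0.0, 1.0, 200)
Vth = rng.normal(0.0, 1.0, 200)
y = np.array([simulated_power(l, v) for l, v in zip(L, Vth)])

# Quadratic response-surface basis and least-squares fit.
X = np.column_stack([np.ones_like(L), L, Vth, L * Vth, L**2, Vth**2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print("fitted response surface coefficients:", np.round(coef, 3))
```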

Relevance:

30.00%

Publisher:

Abstract:

For structured-light scanners, the projective geometry between a projector-camera pair is identical to that of a camera-camera pair. Consequently, in conjunction with calibration, a variety of geometric relations are available for three-dimensional Euclidean reconstruction. In this paper, we use projector-camera epipolar properties and the projective invariance of the cross-ratio to solve for 3D geometry. A key contribution of our approach is the use of homographies induced by reference planes, along with a calibrated camera, resulting in a simple parametric representation for projector and system calibration. Compared to existing solutions that require an elaborate calibration process, our method is simple while ensuring geometric consistency. Our formulation using the invariance of the cross-ratio is also extensible to multiple estimates of 3D geometry that can be analysed in a statistical sense. The performance of our system is demonstrated on some cultural artifacts and geometric surfaces.
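
The projective invariance that the method relies on is easy to verify numerically: the cross-ratio of four collinear points is unchanged by a projective transformation. A minimal sketch (the sample points are arbitrary):

```python
# Cross-ratio of four collinear points, and its invariance under a simple
# projectivity (here, a scaling plus translation).
import numpy as np

def cross_ratio(a, b, c, d):
    """Cross-ratio (AC/BC)/(AD/BD) of four collinear points."""
    dist = lambda p, q: np.linalg.norm(np.asarray(p) - np.asarray(q))
    return (dist(a, c) / dist(b, c)) / (dist(a, d) / dist(b, d))

pts = [np.array([t, 2.0 * t]) for t in (0.0, 1.0, 2.0, 4.0)]
img = [3.0 * p + 1.0 for p in pts]          # transformed (still collinear) points
print(cross_ratio(*pts), cross_ratio(*img)) # equal values
```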

Relevance:

30.00%

Publisher:

Abstract:

A fuzzy system is developed using a linearized performance model of the gas turbine engine for performing gas turbine fault isolation from noisy measurements. By using a priori information about measurement uncertainties and through design variable linking, the design of the fuzzy system is posed as an optimization problem with a small number of design variables, which can be solved using a genetic algorithm in considerably less computer time. The faults modeled are module faults in five modules: fan, low pressure compressor, high pressure compressor, high pressure turbine and low pressure turbine. The measurements used are deviations in exhaust gas temperature, low rotor speed, high rotor speed and fuel flow from a baseline 'good engine'. The genetic fuzzy system (GFS) allows rapid development of the rule base if the fault signatures and measurement uncertainties change, which happens for different engines and airlines. In addition, the genetic fuzzy system reduces the human effort needed in the trial and error process used to design the fuzzy system, and makes the development of such a system easier and faster. A radial basis function neural network (RBFNN) is also used to preprocess the measurements before fault isolation. The RBFNN shows significant noise reduction and, when combined with the GFS, leads to a diagnostic system that is highly robust to the presence of noise in the data, showing the advantage of using a soft computing approach for gas turbine diagnostics.
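
The RBFNN preprocessing step can be sketched as a Gaussian radial-basis-function network fitted by least squares to smooth a noisy measurement-deviation signal. The centers, width and synthetic signal below are illustrative assumptions, not the paper's configuration.

```python
# Sketch: Gaussian RBF network as a least-squares denoiser for a noisy signal.
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 100)
clean = np.tanh(8.0 * (t - 0.5))                 # assumed fault-signature shape
noisy = clean + 0.15 * rng.standard_normal(t.size)

centers = np.linspace(0.0, 1.0, 12)              # RBF centers on a grid
width = 0.08
Phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width**2))
w, *_ = np.linalg.lstsq(Phi, noisy, rcond=None)  # least-squares output weights
denoised = Phi @ w

print("noise rms before:", np.std(noisy - clean).round(3),
      "after:", np.std(denoised - clean).round(3))
```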

Relevance:

30.00%

Publisher:

Abstract:

We have developed a theory for an electrochemical way of measuring the statistical properties of a nonfractally rough electrode. We obtained the expression for the current transient on a rough electrode, which shows three time regions: the short and long time limits and the transition region between them. The expressions for these time ranges are exploited to extract morphological information about the surface roughness. In the short and long time regimes, we extract information regarding various morphological features like the roughness factor, average roughness, curvature, correlation length, dimensionality of roughness, and a polynomial approximation for the correlation function. The formulas for the surface structure factors (the measure of surface roughness) of rough surfaces in terms of measured reversible and diffusion-limited current transients are also obtained. Finally, we explore the feasibility of making such measurements.
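
A back-of-the-envelope sketch of the short-time limit: on a rough electrode the diffusion-limited (Cottrell) transient is scaled by the roughness factor (true area over geometric area). This is a textbook relation, not the paper's full transient expression, and all parameter values below are illustrative assumptions.

```python
# Short-time rough-electrode current as roughness factor times the Cottrell current.
import numpy as np

F = 96485.0                    # C/mol, Faraday constant
n, A, c, D = 1, 1e-4, 1.0, 1e-9   # electrons, area m^2, conc mol/m^3 (1 mM), D m^2/s
R_star = 1.8                   # assumed roughness factor (true/geometric area)

t = np.array([1e-3, 1e-2, 1e-1])                    # s, short-time regime
i_flat = n * F * A * c * np.sqrt(D / (np.pi * t))   # Cottrell current, A
i_rough = R_star * i_flat                           # short-time rough-electrode limit
print(i_rough)
```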

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present robust semi-blind (SB) algorithms for the estimation of beamforming vectors for multiple-input multiple-output (MIMO) wireless communication. The transmitted symbol block is assumed to comprise a known sequence of training (pilot) symbols followed by information-bearing blind (unknown) data symbols. Analytical expressions are derived for the robust SB estimators of the MIMO receive and transmit beamforming vectors. These robust SB estimators employ a preliminary estimate obtained from the pilot symbol sequence and leverage the second-order statistical information from the blind data symbols. We employ the theory of Lagrangian duality to derive the robust estimate of the receive beamforming vector by maximizing an inner product, while constraining the channel estimate to lie in a confidence sphere centered at the initial pilot estimate. Two different schemes are then proposed for computing the robust estimate of the MIMO transmit beamforming vector. Simulation results presented at the end illustrate the superior performance of the robust SB estimators.
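
The ingredients can be sketched numerically: a least-squares channel estimate from the pilots, a sample covariance from the blind data, and a principal-eigenvector refinement blended with the pilot estimate. This is an illustrative combination of those ingredients, not the paper's exact Lagrangian-duality estimator, and the blend weight and signal model are assumptions.

```python
# Sketch: semi-blind refinement of a receive beamforming/channel estimate.
import numpy as np

rng = np.random.default_rng(3)
nr, n_pilot, n_data = 4, 8, 200
h = (rng.standard_normal(nr) + 1j * rng.standard_normal(nr)) / np.sqrt(2)

s_pilot = np.sign(rng.standard_normal(n_pilot))   # known BPSK pilots
s_data = np.sign(rng.standard_normal(n_data))     # unknown data symbols
noise = lambda n: 0.3 * (rng.standard_normal((nr, n)) + 1j * rng.standard_normal((nr, n)))
Y_pilot = np.outer(h, s_pilot) + noise(n_pilot)
Y_data = np.outer(h, s_data) + noise(n_data)

h_pilot = Y_pilot @ s_pilot / n_pilot             # LS pilot estimate
R = Y_data @ Y_data.conj().T / n_data             # blind sample covariance
vals, vecs = np.linalg.eigh(R)
u = vecs[:, -1]                                   # principal eigenvector
u = u * np.exp(1j * np.angle(u.conj() @ h_pilot)) # resolve the blind phase ambiguity
h_sb = 0.5 * h_pilot + 0.5 * np.linalg.norm(h_pilot) * u   # blended SB estimate

align = lambda e: abs(e.conj() @ h) / (np.linalg.norm(e) * np.linalg.norm(h))
print("pilot-only alignment:", round(align(h_pilot), 3),
      "semi-blind alignment:", round(align(h_sb), 3))
```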

Relevance:

30.00%

Publisher:

Abstract:

The statistical thermodynamics of adsorption in caged zeolites is developed by treating the zeolite as an ensemble of M identical cages or subsystems. Within each cage, adsorption is assumed to occur onto a lattice of n identical sites. Expressions for the average occupancy per cage are obtained by minimizing the Helmholtz free energy in the canonical ensemble, subject to the constraints of constant M and constant number of adsorbates N. Adsorbate-adsorbate interactions in the Bragg-Williams or mean field approximation are treated in two ways: the local mean field approximation (LMFA) is based on the local cage occupancy, and the global mean field approximation (GMFA) is based on the average coverage of the ensemble. The GMFA is shown to be equivalent in formulation to treating the zeolite as a collection of interacting single-site subsystems. In contrast, the treatment in the LMFA retains the description of the zeolite as an ensemble of identical cages, whose thermodynamic properties are conveniently derived in the grand canonical ensemble. For a z-coordinated lattice within the zeolite cage, with ε_aa as the adsorbate-adsorbate interaction parameter, comparisons for different values of ε*_aa = ε_aa z/2kT and of the number of sites per cage, n, illustrate the predicted behaviour for ε*_aa between -1 and 0. We compare the isotherms predicted with the LMFA with previous GMFA predictions [K. G. Ayappa, C. R. Kamala, and T. A. Abinandanan, J. Chem. Phys. 110, 8714 (1999)] (which incorporate both the site volume reduction and a coverage-dependent ε_aa) for xenon and methane in zeolite NaA. In all cases the predicted isotherms are very similar, with the exception of a small steplike feature present in the LMFA for xenon at higher coverages.
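
A global mean-field (Bragg-Williams) isotherm of the kind compared above can be sketched as a one-line root-finding problem: the average coverage θ at pressure p solves K·p = θ/(1−θ)·exp(−2ε*θ), with ε* = ε_aa z/2kT. The sign convention and parameter values here are illustrative assumptions.

```python
# Sketch: solve a mean-field lattice-gas adsorption isotherm for the coverage.
import numpy as np
from scipy.optimize import brentq

def coverage(Kp, eps_star):
    """Solve K*p = theta/(1-theta) * exp(-2*eps_star*theta) for theta in (0, 1)."""
    g = lambda th: np.log(th / (1.0 - th)) - 2.0 * eps_star * th - np.log(Kp)
    return brentq(g, 1e-9, 1.0 - 1e-9)   # g goes from -inf to +inf on (0, 1)

for Kp in (0.1, 1.0, 10.0):
    print(f"Kp = {Kp:5.1f}  theta = {coverage(Kp, eps_star=-0.5):.3f}")
```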

Relevance:

30.00%

Publisher:

Abstract:

Fragment Finder 2.0 is a web-based interactive computing server which can be used to retrieve structurally similar protein fragments from 25 and 90% nonredundant data sets. The computing server identifies structurally similar fragments using the protein backbone C alpha angles. In addition, the identified fragments can be superimposed using either of the two structural superposition programs, STAMP and PROFIT, provided in the server. The freely available Java plug-in Jmol has been interfaced with the server for the visualization of the query and superposed fragments. The server is the updated version of a previously developed search engine and employs an in-house-developed fast pattern matching algorithm. This server can be accessed freely over the World Wide Web through the URL http://cluster.physics.iisc.ernet.in/ff/.
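
The matching criterion can be illustrated with a few lines of numpy: compute the pseudo-bond angle at each C-alpha from consecutive coordinate triples, then compare two fragments within a tolerance. This is a toy stand-in for the server's in-house pattern-matching algorithm; the tolerance is an assumption.

```python
# Sketch: backbone C-alpha pseudo-bond angles and tolerance-based matching.
import numpy as np

def ca_angles(coords):
    """Pseudo-bond angles (degrees) at interior C-alpha atoms of an (N, 3) array."""
    coords = np.asarray(coords, dtype=float)
    angles = []
    for i in range(1, len(coords) - 1):
        u = coords[i - 1] - coords[i]
        v = coords[i + 1] - coords[i]
        cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
        angles.append(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return np.array(angles)

def fragments_match(frag_a, frag_b, tol_deg=10.0):
    """True if all corresponding C-alpha angles agree within tol_deg."""
    return bool(np.all(np.abs(ca_angles(frag_a) - ca_angles(frag_b)) < tol_deg))
```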

Relevance:

30.00%

Publisher:

Abstract:

Electrical failure of insulation is known to be an extremal random process, wherein nominally identical pro-rated specimens of equipment insulation at constant stress fail at inordinately different times, even under laboratory test conditions. In order to estimate the life of power equipment, it is necessary to run long duration ageing experiments under accelerated stresses, and to acquire and analyze insulation-specific failure data. In the present work, Resin Impregnated Paper (RIP), a relatively new insulation system of choice used in transformer bushings, is taken as an example. The failure data have been processed using proven statistical methods, both graphical and analytical. The physical model governing insulation failure at constant accelerated stress has been assumed to be based on a temperature dependent inverse power law model.
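
The analytical treatment can be sketched as a two-parameter Weibull fit to failure times at one accelerated stress, with the characteristic life extrapolated by an inverse power law. The failure times, stresses and power-law exponent below are illustrative assumptions, not the paper's data.

```python
# Sketch: Weibull fit of failure times plus inverse-power-law life extrapolation.
import numpy as np
from scipy import stats

fail_hours = np.array([312., 480., 655., 810., 1020., 1260., 1580., 2100.])
shape, loc, scale = stats.weibull_min.fit(fail_hours, floc=0)   # 2-parameter fit
print(f"Weibull shape (beta) = {shape:.2f}, scale (eta) = {scale:.0f} h")

# Inverse power law: life L ~ E^(-n). Extrapolate the fitted characteristic life
# from an accelerated test stress E_test to a service stress E_service.
n_exp, E_test, E_service = 9.0, 12.0, 6.0     # exponent and stresses (kV/mm), assumed
life_service = scale * (E_test / E_service) ** n_exp
print(f"extrapolated characteristic life ~ {life_service:.2e} h")
```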