235 results for Statistical parameters
Abstract:
An experimental programme based on statistical analysis was used to optimize the reverse flotation of silica from non-magnetic spiral preconcentrate of Kudremukh iron ore. Flotation of silica with amine and starch as the flotation reagents was studied to estimate the optimum reagent levels at various mesh of grind. The experiments were first carried out using a two-level, three-factor design. Analysis of the results showed that two parameters, namely the concentration level of the amine collector and the mesh of grind, were significant. Experiments based on an orthogonal design of the hexagonal type were then carried out to determine the effects of these two variables on the recovery and grade of the concentrate. Regression equations were developed as models, and response contours were plotted using the path of steepest ascent. The maximum response was obtained at 0.27 kg/ton of amine collector, 0.5 kg/ton of starch and a mesh of grind of 48.7% passing 300 mesh, giving a recovery of 83.43% of the Fe in a concentrate containing 66.6% Fe and 2.17% SiO2.
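The two-level, three-factor screening stage described above can be sketched as a full-factorial design matrix. A minimal illustration in Python; the coded low/high physical values used here are invented placeholders, not the study's actual settings:

```python
from itertools import product

# Illustrative 2^3 full-factorial design for the three flotation variables
# named in the abstract; the low/high levels are made-up examples.
factors = {
    "amine_kg_per_ton": (0.2, 0.3),
    "starch_kg_per_ton": (0.4, 0.6),
    "grind_pct_passing_300mesh": (40.0, 55.0),
}

design = []
for levels in product((-1, +1), repeat=len(factors)):
    # Map each coded level (-1/+1) to its physical low/high value.
    run = {name: (lo if c < 0 else hi)
           for c, (name, (lo, hi)) in zip(levels, factors.items())}
    design.append(run)
# 8 runs: every combination of low/high for the three factors
```

Fitting a first-order regression to the responses of these eight runs is what identifies the significant factors before the steepest-ascent stage.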
Abstract:
The present investigation analyses the thermodynamic behaviour of the surfaces and adsorption as a function of temperature and composition in Fe-S-O melts, based on Butler's equation. The calculated values of the surface tension exhibit an elevation or depression, depending on the type of solute added, at a concentration coinciding with that already present in the system. Generally, the desorption of the solutes with increasing temperature results in an initial increase followed by a decrease in the surface tension. The observations are analysed in terms of the surface interaction parameters derived in the present work.
Abstract:
Perfect or even mediocre weather predictions over a long period are almost impossible because of the eventual growth of a small initial error into a significant one. Even though sensitivity to initial conditions limits predictability in chaotic systems, an ensemble of predictions from different possible initial conditions, together with a prediction algorithm capable of resolving the fine structure of the chaotic attractor, can reduce the prediction uncertainty to some extent. The traditional chaotic prediction methods in hydrology are based on single-optimum-initial-condition local models, which can model the sudden divergence of the trajectories with different local functions. Conceptually, global models are ineffective in modeling the highly unstable structure of the chaotic attractor. This paper focuses on an ensemble prediction approach that reconstructs the phase space using different combinations of the chaotic parameters, i.e., embedding dimension and delay time, to quantify the uncertainty in initial conditions. The ensemble approach is implemented through a local-learning wavelet network model with a global feed-forward neural network structure for the phase-space prediction of chaotic streamflow series. Uncertainty in future predictions is quantified by creating an ensemble of predictions with the wavelet network over a range of plausible embedding dimensions and delay times. The ensemble approach proved to be 50% more efficient than single prediction for both the local approximation and wavelet network approaches, and the wavelet network approach proved 30%-50% superior to the local approximation approach. Compared to the traditional local approximation approach with a single initial condition, the total predictive uncertainty in the streamflow is reduced when modeled with ensemble wavelet networks for different lead times.
The localization property of wavelets, through different dilation and translation parameters, helps in capturing most of the statistical properties of the observed data. The need to take into account all plausible initial conditions, and to bring together the characteristics of both local and global approaches to model the unstable yet ordered chaotic attractor of a hydrologic series, is clearly demonstrated.
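The delay-coordinate reconstruction that underlies the ensemble can be sketched as follows. The function name, the synthetic series, and the particular (embedding dimension, delay time) grid are illustrative assumptions; the wavelet-network predictor itself is not reproduced:

```python
import numpy as np

def delay_embed(x, m, tau):
    """Reconstruct an m-dimensional phase space from a scalar series x
    using delay time tau (delay-coordinate embedding)."""
    n = len(x) - (m - 1) * tau
    return np.array([x[i : i + (m - 1) * tau + 1 : tau] for i in range(n)])

# An ensemble over plausible (m, tau) pairs: each reconstruction would feed
# a separate predictor, and the spread of the resulting forecasts is what
# quantifies the initial-condition uncertainty.
series = np.sin(np.linspace(0, 20 * np.pi, 2000))  # stand-in for a streamflow record
ensemble = [delay_embed(series, m, tau) for m in (3, 4, 5) for tau in (5, 10)]
```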
Abstract:
The statistical thermodynamics of adsorption in caged zeolites is developed by treating the zeolite as an ensemble of M identical cages or subsystems. Within each cage, adsorption is assumed to occur onto a lattice of n identical sites. Expressions for the average occupancy per cage are obtained by minimizing the Helmholtz free energy in the canonical ensemble, subject to the constraints of constant M and constant number of adsorbates N. Adsorbate-adsorbate interactions in the Bragg-Williams or mean field approximation are treated in two ways: the local mean field approximation (LMFA) is based on the local cage occupancy, and the global mean field approximation (GMFA) is based on the average coverage of the ensemble. The GMFA is shown to be equivalent in formulation to treating the zeolite as a collection of interacting single-site subsystems. In contrast, the treatment in the LMFA retains the description of the zeolite as an ensemble of identical cages, whose thermodynamic properties are conveniently derived in the grand canonical ensemble. For a z-coordinated lattice within the zeolite cage, with epsilon(aa) as the adsorbate-adsorbate interaction parameter, the comparisons for different values of epsilon(aa)* = epsilon(aa)z/2kT and number of sites per cage, n, illustrate that for -1
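The GMFA self-consistency for the average coverage can be sketched numerically. The algebraic form below, a Langmuir-like isotherm with a mean-field activity correction, is an illustrative reading of the approximation, not the paper's exact expression:

```python
import math

def gmfa_coverage(Kp, eps_star, tol=1e-10, max_iter=1000):
    """Solve the illustrative global mean-field self-consistency
        theta = a / (1 + a),  a = Kp * exp(2 * eps_star * theta),
    where eps_star = epsilon(aa)*z/2kT as defined in the abstract and
    Kp lumps the Langmuir constant and pressure (both assumptions here)."""
    theta = 0.5
    for _ in range(max_iter):
        a = Kp * math.exp(2.0 * eps_star * theta)
        new = a / (1.0 + a)
        if abs(new - theta) < tol:
            return new
        theta = 0.5 * (theta + new)  # damped fixed-point iteration for stability
    return theta
```

With eps_star = 0 the interaction term vanishes and the expression reduces to the ordinary Langmuir isotherm, which is a convenient sanity check.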
Abstract:
This paper presents a general methodology for the synthesis of the external boundary of the workspaces of a planar manipulator with arbitrary topology. Both the desired workspace and the manipulator workspaces are identified by their boundaries and are treated as simple closed polygons. The paper introduces the concept of a best match configuration and shows that the corresponding transformation can be obtained by using the concept of shape normalization available in the image processing literature. Introduction of the concept of shape in workspace synthesis allows highly accurate synthesis with a smaller number of design variables. This paper uses a new global-property-based vector representation for the shape of the workspaces, which is computationally efficient because six of the seven elements of this vector are obtained as a by-product of the shape normalization procedure. The synthesis of workspaces is formulated as an optimization problem in which the distance between the shape vector of the desired workspace and that of the workspace of the manipulator at hand is minimized by changing the dimensional parameters of the manipulator. In view of the irregular nature of the error manifold, the statistical optimization procedure of simulated annealing has been used. A number of worked-out examples illustrate the generality and efficiency of the present method. (C) 1998 Elsevier Science Ltd. All rights reserved.
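The statistical optimization step can be sketched with a generic simulated-annealing minimizer. The cost function, step proposal and cooling schedule below are placeholders standing in for the shape-vector distance and the manipulator's dimensional parameters:

```python
import math
import random

def simulated_annealing(cost, x0, step, t0=1.0, cooling=0.995, iters=5000, seed=0):
    """Generic simulated-annealing minimizer of the kind the paper applies to
    the shape-vector distance. cost(x) is the objective, step(x, rng) proposes
    a neighboring design, and the geometric schedule is an assumption."""
    rng = random.Random(seed)
    x, fx, t = x0, cost(x0), t0
    for _ in range(iters):
        y = step(x, rng)
        fy = cost(y)
        # Accept improvements always; accept worse moves with Boltzmann probability.
        if fy < fx or rng.random() < math.exp(-(fy - fx) / max(t, 1e-12)):
            x, fx = y, fy
        t *= cooling
    return x, fx
```

On an irregular error manifold the occasional uphill acceptance is what lets the search escape local minima that a pure descent would be trapped in.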
Abstract:
Mulberry fiber (Bivoltine) and non-mulberry fiber (Tassar) were subjected to stress-strain studies, and the corresponding samples were examined using wide-angle X-ray scattering. The two fiber types exhibit distinct characteristic stress-strain curves, and this difference has been correlated with changes in the crystallite shape ellipsoids in all the fibers. Crystal structure studies of the Tassar fibers show an interesting transformation from antiparallel chains to parallel chains.
Abstract:
The velocity distribution for a vibrated granular material is determined in the dilute limit, where the frequency of particle collisions with the vibrating surface is large compared to the frequency of binary collisions. The particle motion is driven by the source of energy due to particle collisions with the vibrating surface, and two dissipation mechanisms, inelastic collisions and air drag, are considered. In the latter case, a general form for the drag force is assumed. First, the distribution function for the vertical velocity of a single particle colliding with a vibrating surface is determined in the limit where the dissipation during a collision due to inelasticity, or between successive collisions due to drag, is small compared to the energy of the particle. In addition, two types of amplitude functions for the velocity of the surface, symmetric and asymmetric about zero velocity, are considered. In all cases, differential equations for the distribution of velocities at the vibrating surface are obtained using a flux balance condition in velocity space, and these are solved to determine the distribution function. It is found that the distribution function is Gaussian when the dissipation is due to inelastic collisions and the amplitude function is symmetric, and the mean square velocity scales as <U^2>_s/(1 - e^2), where <U^2>_s is the mean square velocity of the vibrating surface and e is the coefficient of restitution. The distribution function is very different from a Gaussian when the dissipation is due to air drag and the amplitude function is symmetric, and the mean square velocity scales as (<U^2>_s g/mu_m)^(1/(m+2)) when the acceleration due to the fluid drag is -mu_m u_y |u_y|^(m-1), where g is the acceleration due to gravity.
For an asymmetric amplitude function, the distribution function at the vibrating surface is found to be sharply peaked around +/- 2<U>_s/(1 - e) when the dissipation is due to inelastic collisions, and around +/- ((m+2)<U>_s g/mu_m)^(1/(m+1)) when the dissipation is due to fluid drag, where <U>_s is the mean velocity of the surface. The distribution functions are compared with numerical simulations of a particle colliding with a vibrating surface, and excellent agreement is found with no adjustable parameters. The distribution function for a two-dimensional vibrated granular material that includes the first effect of binary collisions is determined for a system with dissipation due to inelastic collisions and a symmetric amplitude function for the velocity of the vibrating surface, in the limit delta_I = 2nr/(1 - e) << 1. Here, n is the number of particles per unit width and r is the particle radius. In this limit, an asymptotic analysis is used about the limit where there are no binary collisions. It is found that the distribution function has a power-law divergence proportional to |u_x|^(c delta_I - 1) in the limit u_x -> 0, where u_x is the horizontal velocity. The constant c and the moments of the distribution function are evaluated from the conservation equation in velocity space. It is found that the mean square velocity in the horizontal direction scales as O(delta_I T), and the nontrivial third moments of the velocity distribution scale as O(delta_I epsilon_I T^(3/2)), where epsilon_I = (1 - e)^(1/2). Here, T = 2<U^2>_s/(1 - e) is the mean square velocity of the particles.
Abstract:
An understanding of application I/O access patterns is useful in several situations. First, gaining insight into what applications are doing with their data at a semantic level helps in designing efficient storage systems. Second, it helps create benchmarks that closely mimic realistic application behavior. Third, it enables autonomic systems, as the information obtained can be used to adapt the system in a closed loop. All of these use cases require the ability to extract the application-level semantics of I/O operations. Methods such as modifying application code to associate I/O operations with semantic tags are intrusive. It is well known that network file system traces are an important source of information that can be obtained non-intrusively and analyzed either online or offline. These traces are a sequence of primitive file system operations and their parameters. Simple counting, statistical analysis or deterministic search techniques are inadequate for discovering application-level semantics in the general case, because of the inherent variation and noise in realistic traces. In this paper, we describe a trace analysis methodology based on Profile Hidden Markov Models. We show that the methodology has powerful discriminatory capabilities that enable it to recognize applications based on the patterns in their traces, and to mark out regions in a long trace that encapsulate sets of primitive operations representing higher-level application actions. It is robust enough to work around discrepancies between training and target traces, such as differences in length and interleaving with other operations. We demonstrate the feasibility of recognizing patterns based on a small sampling of the trace, enabling faster trace analysis. Preliminary experiments show that the method is capable of learning accurate profile models on live traces in an online setting.
We present a detailed evaluation of this methodology in a UNIX environment using NFS traces of selected commonly used applications, such as compilations, as well as industrial-strength benchmarks such as TPC-C and Postmark, and discuss its capabilities and limitations in the context of the use cases mentioned above.
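The core scoring operation behind such recognition, evaluating the likelihood that a symbolized operation sequence was generated by a trained model, can be sketched with a plain forward algorithm. The operation alphabet and the uniform two-state model below are illustrative stand-ins, not the paper's trained profile HMMs:

```python
import numpy as np

# Symbolize primitive NFS operations into an integer alphabet (assumed subset).
OPS = {"lookup": 0, "read": 1, "write": 2, "getattr": 3}

def log_forward(seq, start, trans, emit):
    """Log-likelihood of an observation sequence under an HMM, computed with
    the forward algorithm in log space for numerical stability.
    start: (S,) initial state probabilities; trans: (S, S) transition matrix;
    emit: (S, V) emission matrix over the operation alphabet."""
    alpha = np.log(start) + np.log(emit[:, seq[0]])
    for o in seq[1:]:
        # alpha_j = logsum_i (alpha_i + log T_ij) + log B_j(o)
        alpha = (np.logaddexp.reduce(alpha[:, None] + np.log(trans), axis=0)
                 + np.log(emit[:, o]))
    return float(np.logaddexp.reduce(alpha))

# Score a short trace against a toy 2-state model with uniform parameters.
start = np.array([0.5, 0.5])
trans = np.full((2, 2), 0.5)
emit = np.full((2, 4), 0.25)
trace = [OPS[o] for o in ("lookup", "read", "read")]
score = log_forward(trace, start, trans, emit)
```

Comparing such scores across models trained on different applications is what allows a trace, or a region of a trace, to be attributed to one application rather than another.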
Abstract:
Vapour adsorption refrigeration systems (VAdS) have the advantage of scalability over a wide range of capacities, from a few watts to several kilowatts. In the first instance, the design of a system requires the characteristics of the adsorbate-adsorbent pair. Invariably, the void volume in the adsorbent reduces the throughput of the thermal compressor, in a manner similar to the clearance volume in a reciprocating compressor. This paper presents a study of the activated carbon + HFC-134a (1,1,1,2-tetrafluoroethane) system as a possible pair for a typical refrigeration application. The aim of this study is to unfold the nexus between the adsorption parameters, the achievable packing densities of charcoal and the throughput of a thermal compressor. It is shown that, for a thermal compressor, the adsorbent should not only have a high surface area but should also provide a high packing density. Given the adsorption characteristics of an adsorbent-adsorbate pair and the operating conditions, this paper presents a method for calculating the minimum packing density necessary for an effective throughput of a thermal compressor. (C) 2002 Elsevier Science Ltd. All rights reserved.
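The clearance-volume analogy suggests a simple balance for the minimum packing density: the swing in adsorbed refrigerant must at least offset the swing of gas held in the void space. The algebraic form below is an illustrative reading of that argument, not the paper's exact method:

```python
def min_packing_density(dq, rho_solid, drho_gas):
    """Minimum packing density (kg/m^3 of bed) for positive net throughput,
    from the illustrative per-unit-bed-volume balance
        rho_pack * dq = (1 - rho_pack / rho_solid) * drho_gas,
    i.e. adsorbed swing equals the void-space gas swing at the break-even point.
    dq: swing in uptake over the cycle (kg refrigerant per kg carbon)
    rho_solid: skeletal density of the carbon (kg/m^3)
    drho_gas: swing in gas density in the void space (kg/m^3)
    All symbols and the balance itself are assumptions for illustration."""
    return drho_gas / (dq + drho_gas / rho_solid)
```

Above this density the thermal compressor delivers net refrigerant per cycle; below it the void-space gas swing consumes the entire adsorption swing, mirroring the way clearance volume limits a reciprocating compressor.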
Abstract:
Fetal lung and liver tissues were examined by ultrasound in 240 subjects at 24 to 38 weeks of gestational age in order to investigate the feasibility of predicting the maturity of the lung from the textural features of sonograms. A region of interest of 64 x 64 pixels is used for extracting textural features. Since the histological properties of the liver are claimed to remain constant with respect to gestational age, features obtained from the lung region are compared with those from the liver. Though the mean values of some of the features show a specific trend with respect to gestational age, the variance is too high to guarantee a definite prediction of gestational age. We therefore restricted our purview to the feasibility of fetal lung maturity prediction using statistical textural features. Of the 64 features extracted, those that are correlated with gestational age and less computationally intensive are selected. The results of our study show that the sonographic features hold some promise in determining whether the fetal lung is mature or immature.
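Statistical textural features of this kind are typically derived from grey-level co-occurrence statistics. A minimal sketch with two classic features follows; the offset, quantization and feature choice are assumptions, and the study's 64 features are not reproduced:

```python
import numpy as np

def glcm(img, dy=0, dx=1, levels=8):
    """Grey-level co-occurrence matrix for one pixel offset (dy, dx).
    img is an integer array already quantized to `levels` grey levels."""
    h, w = img.shape
    P = np.zeros((levels, levels))
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def contrast(P):
    """Haralick-type contrast: expected squared grey-level difference."""
    i, j = np.indices(P.shape)
    return float(((i - j) ** 2 * P).sum())

def energy(P):
    """Haralick-type energy (angular second moment)."""
    return float((P ** 2).sum())
```

On a 64 x 64 region of interest one would compute such matrices for several offsets and feed the resulting features to the maturity classifier.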
Abstract:
Transition in the boundary layer on a flat plate is examined from the point of view of the intermittent production of turbulent spots. On the hypothesis of localized laminar breakdown, for which there is some experimental evidence, Emmons' probability calculations can be extended to explain the observed statistical similarity of transition regions. Application of these ideas allows detailed calculations of the boundary layer parameters, including mean velocity profiles and skin friction, during transition. The mean velocity profiles belong to a universal one-parameter family with the intermittency factor as the parameter. From an examination of experimental data, the probable existence of a relation between the transition Reynolds number and the rate of production of turbulent spots is deduced. A simple new technique for the measurement of the intermittency factor by a Pitot tube is reported.
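The one-parameter family can be sketched as an intermittency-weighted blend of laminar and turbulent profiles. The particular universal form of gamma used below, 1 - exp(-0.412 xi^2), is the commonly quoted fit and is stated here as an assumption rather than taken from this abstract:

```python
import math

def intermittency(x, x_t, lam):
    """Universal intermittency distribution (assumed form):
    gamma = 1 - exp(-0.412 * xi**2) with xi = (x - x_t)/lam downstream of the
    transition onset x_t, and gamma = 0 upstream. lam is the extent scale."""
    if x <= x_t:
        return 0.0
    xi = (x - x_t) / lam
    return 1.0 - math.exp(-0.412 * xi * xi)

def mean_velocity(u_lam, u_turb, gamma):
    """One-parameter family: the mean profile is the intermittency-weighted
    average of the laminar and fully turbulent profiles at the same station."""
    return (1.0 - gamma) * u_lam + gamma * u_turb
```

At gamma = 0 the mean profile is purely laminar and at gamma = 1 purely turbulent, so a single measured gamma, e.g. from the Pitot-tube technique, fixes the whole profile in this family.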
Abstract:
We propose a scheme for the compression of tree-structured intermediate code consisting of a sequence of trees specified by a regular tree grammar. The scheme is based on arithmetic coding, and the model that works in conjunction with the coder is automatically generated from the syntactical specification of the tree language. Experiments on data sets consisting of intermediate code trees yield compression ratios from 2.5 to 8, for file sizes ranging from 167 bytes to 1 megabyte.
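The grammar-driven model can be sketched through its ideal code length: each node costs -log2 of the probability the model assigns to the production chosen at that point, and an arithmetic coder approaches this bound. The toy grammar and probabilities below are invented examples, not derived from the paper:

```python
import math

# A hypothetical regular tree grammar: for each nonterminal, the productions
# legal at that point with illustrative model probabilities (summing to 1).
grammar = {
    "Expr": {"Add(Expr,Expr)": 0.4, "Mul(Expr,Expr)": 0.2, "Const": 0.4},
}

def code_length(choices, grammar):
    """Ideal arithmetic-code length, in bits, for a pre-order sequence of
    (nonterminal, production) choices describing one tree."""
    return sum(-math.log2(grammar[nt][prod]) for nt, prod in choices)

# Coding the tree Add(Const, Const): one Add choice, then two Const choices.
bits = code_length([("Expr", "Add(Expr,Expr)"),
                    ("Expr", "Const"),
                    ("Expr", "Const")], grammar)
```

Because only the productions legal for the current nonterminal compete for probability mass, the grammar itself narrows each coding decision, which is where the reported compression ratios come from.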
Abstract:
Unintentionally doped homoepitaxial InSb films have been grown by liquid phase epitaxy employing ramp-cooling and step-cooling growth modes. The effects of growth temperature, degree of supercooling and growth duration on the surface morphology and crystallinity were investigated. The major surface features of the grown films, such as terracing, inclusions and meniscus lines, are presented step by step, and a variety of methods devised to overcome such undesirable features are described in detail. The optimization of the growth parameters has led to the growth of smooth and continuous films. From detailed morphological, X-ray diffraction, scanning electron microscopy and Raman studies, a correlation between the surface morphology and crystallinity has been established.