57 results for Process Automation

at Indian Institute of Science - Bangalore - India


Relevance:

30.00%

Publisher:

Abstract:

Gaussian processes (GPs) are promising Bayesian methods for classification and regression problems. Designing a GP classifier and making predictions with it are, however, computationally demanding, especially when the training set is large. Sparse GP classifiers are known to overcome this limitation. In this letter, we propose and study a validation-based method for sparse GP classifier design. The proposed method uses a negative log predictive (NLP) loss measure, which is easy to compute for GP models. We use this measure for both basis vector selection and hyperparameter adaptation. Experimental results on several real-world benchmark data sets show better or comparable generalization performance over existing methods.
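The abstract does not spell out the loss; the standard negative log predictive loss for a classifier evaluated on a validation set of size $n$, given here as background rather than the paper's exact definition, is

\[
\mathrm{NLP} \;=\; -\frac{1}{n}\sum_{i=1}^{n} \log p\!\left(y_i \mid \mathbf{x}_i, \mathcal{D}\right),
\]

where $p(y_i \mid \mathbf{x}_i, \mathcal{D})$ is the predictive probability the trained GP assigns to the true label $y_i$; lower values indicate better-calibrated predictions.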

Relevance:

30.00%

Publisher:

Abstract:

Gaussian processes (GPs) are promising Bayesian methods for classification and regression problems. They have also been used for semi-supervised learning tasks. In this paper, we propose a new algorithm for solving the semi-supervised binary classification problem using sparse GP regression (GPR) models. It is closely related to semi-supervised learning based on support vector regression (SVR) and maximum margin clustering. The proposed algorithm is simple and easy to implement. Unlike the SVR-based algorithm, it directly yields a sparse solution. Moreover, the hyperparameters are estimated easily, without resorting to expensive cross-validation. The use of a sparse GPR model makes the proposed algorithm scalable. Preliminary results on synthetic and real-world data sets demonstrate the efficacy of the new algorithm.
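The abstract does not give the algorithm's details; the following is a minimal sketch of the general idea under my own assumptions: a self-training loop that fits a GP regressor on +/-1 labels and pseudo-labels the unlabeled points by the sign of the predictive mean. The scikit-learn regressor used here is dense, whereas the paper relies on a sparse GPR model for scalability.

```python
# Hedged sketch: semi-supervised binary classification via GP regression.
# Hypothetical simplification, not the authors' actual method.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def semi_supervised_gpr(X_lab, y_lab, X_unlab, n_iters=5):
    """y_lab holds +/-1 labels; returns a fitted regressor."""
    gpr = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
    X, y = X_lab, y_lab.astype(float)
    for _ in range(n_iters):
        gpr.fit(X, y)
        pseudo = np.sign(gpr.predict(X_unlab))   # pseudo-label unlabeled points
        X = np.vstack([X_lab, X_unlab])
        y = np.concatenate([y_lab.astype(float), pseudo])
    return gpr

# Predicted class of new points: np.sign(model.predict(X_new))
```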

Relevance:

30.00%

Publisher:

Abstract:

Process control rules may be specified using decision tables. Such a specification is superior when the control logic is dominated by logical decisions. In this paper we give a method for detecting redundancies, incompleteness, and contradictions in such specifications. Using such a technique thus ensures the validity of the specifications.
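The paper's detection method is not described in the abstract; as an illustration of the three properties, a brute-force check over a hypothetical rule encoding (condition tuples with don't-cares mapped to actions) might look like this:

```python
# Hedged sketch: validating a decision table. The rule encoding is my own,
# invented for illustration; the paper's algorithm is not reproduced here.
from itertools import product

def check_table(rules, n_conditions):
    """rules: list of (condition_tuple, action); '*' means don't-care."""
    def matches(rule_cond, case):
        return all(r == '*' or r == c for r, c in zip(rule_cond, case))

    for case in product([True, False], repeat=n_conditions):
        hits = [(cond, act) for cond, act in rules if matches(cond, case)]
        actions = {act for _, act in hits}
        if not hits:
            print(f"incomplete: no rule covers {case}")
        elif len(actions) > 1:
            print(f"contradiction: {case} triggers actions {actions}")
        elif len(hits) > 1:
            print(f"redundancy: {case} matched by {len(hits)} rules")

# Two overlapping rules: (True, True) triggers both actions -> contradiction;
# cases with the first condition False are uncovered -> incomplete.
check_table([((True, '*'), 'open_valve'), ((True, True), 'close_valve')], 2)
```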

Relevance:

30.00%

Publisher:

Abstract:

We study the problem of analyzing the influence of various factors on individual messages posted in social media. The problem is challenging because several types of influence propagate through the social-media network and act simultaneously on any user. Additionally, the topic composition of the influencing factors and the susceptibility of users to these influences evolve over time. This problem has not been studied before, and off-the-shelf models are unsuitable for the purpose. To capture the complex interplay of these factors, we propose a new non-parametric model called the Dynamic Multi-Relational Chinese Restaurant Process. It accounts for the user network in data generation and also allows the parameters to evolve over time. Designing inference algorithms for this model that suit large-scale social-media data is another challenge. To this end, we propose a scalable, multi-threaded inference algorithm based on online Gibbs sampling. Extensive evaluations on large-scale Twitter and Facebook data show that the extracted topics, when applied to authorship and commenting prediction, outperform state-of-the-art baselines. More importantly, our model produces valuable insights on topic trends and user-personality trends beyond the capability of existing approaches.
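As background for readers unfamiliar with the underlying prior, a draw from the plain Chinese Restaurant Process, which the paper's dynamic, multi-relational variant generalizes, can be sketched as follows; none of the dynamic or network-aware machinery is reproduced here.

```python
# Background sketch: the plain Chinese Restaurant Process. Customer n joins
# an existing table k with probability n_k / (n + alpha) and opens a new
# table with probability alpha / (n + alpha).
import random

def crp(n_customers, alpha=1.0, seed=0):
    rng = random.Random(seed)
    counts = []                          # customers seated at each table
    assignments = []
    for n in range(n_customers):
        probs = counts + [alpha]         # existing tables + a new table
        r = rng.uniform(0, n + alpha)    # sum(probs) == n + alpha
        acc = 0.0
        for k, p in enumerate(probs):
            acc += p
            if r <= acc:
                break
        if k == len(counts):
            counts.append(1)             # open a new table
        else:
            counts[k] += 1
        assignments.append(k)
    return assignments

print(crp(10, alpha=1.5))
```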

Relevance:

20.00%

Publisher:

Abstract:

Modeling and analysis of wave propagation in elastic solids undergoing damage and growth processes are reported in this paper. Two types of diagnostic problem are considered: (1) the propagation of waves in the presence of a slow growth process and (2) the propagation of waves in the presence of a fast growth process. The proposed model employs a slow and a fast time scale and a homogenization technique at the wavelength scale. A detailed analysis of wave dispersion is carried out. A spectral analysis reveals certain low-frequency bands where the interaction between the wave and the growth process produces acoustic metamaterial-like behavior. Various practical issues in designing an efficient method of acousto-ultrasonic wave-based diagnostics of the growth process are discussed. Diagnostics of isotropic damage in a ductile or quasi-brittle solid using a microsecond pulsating signal is simulated to illustrate the practical application of the proposed modeling and analysis. The simulated results explain how an estimate of signal spreading can be effectively employed to detect steady-state damage or the saturation of a growth process.
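The exact formulation is not given in the abstract; a conventional two-time-scale ansatz of the kind the model appears to use, stated here as an assumption, is

\[
u(x,t) \;=\; u_0(x,\tau,T) \;+\; \varepsilon\, u_1(x,\tau,T) \;+\; \mathcal{O}(\varepsilon^2),
\qquad \tau = t, \quad T = \varepsilon t,
\]

where the small parameter $\varepsilon \ll 1$ separates the fast scale of the propagating wave from the slow scale of the damage-growth process.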

Relevance:

20.00%

Publisher:

Abstract:

The precipitation processes in dilute nitrogen alloys of titanium have been examined in detail by conventional transmission electron microscopy (CTEM) and high-resolution electron microscopy (HREM). The alloy Ti-2 at. pct N, on quenching from its high-temperature beta phase field, has been found to undergo early stages of decomposition. The supersaturated solid solution (alpha''-hcp) on decomposition gives rise to an intimately mixed, irresolvable product microstructure. The associated strong tweed contrast makes it difficult to understand the characteristic features of the process. Therefore, HREM has been carried out with a view to obtaining a clear picture of the decomposition process. Studies on the quenched samples of the alloy suggest the formation of solute-rich zones, a few atomic layers thick, randomly distributed throughout the matrix. On aging, these zones grow to a size beyond which the precipitate/matrix interfaces appear to become incoherent and the alpha' (tetragonal) product phase is seen distinctly. The structural details, the crystallography of the precipitation process, and the sequence of the precipitation reaction in the system are illustrated.

Relevance:

20.00%

Publisher:

Abstract:

There are essentially two different phenomenological models available to describe the interdiffusion process in binary systems in the solid state. The first, which is used more frequently, is based on the theory of flux partitioning. The second, developed much more recently, uses the theory of dissociation and reaction. Although the theory of flux partitioning has been widely used, we found that it does not account for the mobility of both species and is therefore unsuitable for most interdiffusion systems. We have first modified this theory to take into account the mobility of both species and then further extended it to develop relations for the integrated diffusion coefficient and the ratio of the diffusivities of the species. The versatility of the two models is examined in the Co-Si system with respect to different end-member compositions. From our analysis, we found that the applicability of the theory of flux partitioning is rather limited, whereas the theory of dissociation and reaction can be used in any binary system.
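For reference, Wagner's standard definition of the integrated diffusion coefficient for a phase $\beta$ with a narrow homogeneity range, which may differ in detail from the relations derived in the paper, is

\[
\tilde{D}_{\mathrm{int}}^{\beta} \;=\; \int_{N_B^{\beta_1}}^{N_B^{\beta_2}} \tilde{D}\; dN_B,
\]

i.e., the interdiffusion coefficient $\tilde{D}$ integrated over the composition range $N_B^{\beta_1}$ to $N_B^{\beta_2}$ of the phase.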

Relevance:

20.00%

Publisher:

Abstract:

A method has been developed for the removal of chromium using ferrous sulphide generated in situ. The effects of experimental parameters such as pH, reagent dosages, and interference from cations and chelating agents have been investigated. Under optimum conditions, removal efficiencies of 99% and 97% were obtained for synthetic and industrial samples, respectively. The method offers all the advantages of the sulphide precipitation process and can easily be adopted for industrial effluents.

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of deciding whether the output of a boolean circuit is determined by a partial assignment to its inputs. This problem is easily shown to be hard, i.e., co-NP-complete. However, many of the consequences of a partial input assignment may be determined in linear time by iterating the following step: if we know the values of some inputs to a gate, we can deduce the values of some outputs of that gate. This process of iteratively deducing some of the consequences of a partial assignment is called propagation. This paper explores the parallel complexity of propagation, i.e., the complexity of determining whether the output of a given boolean circuit is determined by propagating a given partial input assignment. We give a complete classification of the problem into those cases that are P-complete and those that are unlikely to be P-complete.
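The iterative step described above is easy to state in code. The sketch below uses a hypothetical gate encoding of my own (it is not from the paper) and propagates a partial assignment to a fixed point:

```python
# Hedged sketch: iterative propagation of a partial input assignment
# through a boolean circuit, per the step described in the abstract.
def propagate(gates, values):
    """gates: dict name -> (op, input_names); values: partial dict name -> bool."""
    changed = True
    while changed:
        changed = False
        for name, (op, ins) in gates.items():
            if name in values:
                continue
            vals = [values.get(i) for i in ins]   # None = still unknown
            out = None
            if op == 'AND':
                if False in vals: out = False     # one False forces the gate
                elif all(v is True for v in vals): out = True
            elif op == 'OR':
                if True in vals: out = True       # one True forces the gate
                elif all(v is False for v in vals): out = False
            elif op == 'NOT' and vals[0] is not None:
                out = not vals[0]
            if out is not None:
                values[name] = out
                changed = True
    return values

# x AND y with x = False: the output is forced regardless of y.
print(propagate({'g': ('AND', ['x', 'y'])}, {'x': False}))  # g -> False
```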

Relevance:

20.00%

Publisher:

Abstract:

A numerical model of the entire casting process, from the mould-filling stage to complete solidification, is presented. The model takes into consideration any phase change taking place during the filling process. A volume-of-fluid (VOF) method is used for tracking the metal–air interface during filling, and an enthalpy-based macro-scale solidification model is used for the phase-change process. The model is demonstrated for the filling and solidification of a Pb–15 wt% Sn alloy in a side-cooled, two-dimensional rectangular cavity, and the resulting evolution of a mushy region and macrosegregation are studied. The effects of filling-related process parameters, namely the degree of melt superheat and the filling velocity, on macrosegregation in the cavity are also investigated. Results show significant differences in the progress of the mushy zone and in the macrosegregation pattern between this analysis and a conventional analysis without the filling effect.
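The abstract names a volume-of-fluid interface treatment and an enthalpy-based solidification model; their textbook forms, given here as assumptions rather than the paper's exact equations, are

\[
\frac{\partial F}{\partial t} + \mathbf{u}\cdot\nabla F = 0,
\qquad
H = \int_{0}^{T} c_p \, dT + f_l L,
\]

where $F$ is the fluid volume fraction advected with the melt velocity $\mathbf{u}$, $H$ is the mixture enthalpy, $f_l$ is the liquid fraction, and $L$ is the latent heat of fusion.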

Relevance:

20.00%

Publisher:

Abstract:

Boron carbide is produced in a heat-resistance furnace using boric oxide and petroleum coke as the raw materials, and the product yield is very low. Heat transfer plays an important role in the formation of boron carbide; the temperature at the core reaches up to 2600 K. No experimental study of this high-temperature process, particularly in terms of temperature measurement and heat transfer, is available in the open literature. Therefore, a laboratory-scale hot model of the process has been set up to measure temperatures under harsh conditions at different locations in the furnace, using various temperature-measurement devices such as a pyrometer and several types of thermocouple. Particular attention was paid to the accuracy and reliability of the measured data. The recorded data were analysed to understand the heat-transfer process inside the reactor and its effect on the formation of boron carbide.
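The abstract does not state the chemistry; the usual carbothermal reduction by which boric oxide and coke yield boron carbide is

\[
2\,\mathrm{B_2O_3} + 7\,\mathrm{C} \;\rightarrow\; \mathrm{B_4C} + 6\,\mathrm{CO},
\]

a strongly endothermic reaction, consistent with the core temperatures of roughly 2600 K reported above.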

Relevance:

20.00%

Publisher:

Abstract:

The development of techniques for scaling up classifiers so that they can be applied to problems with large training datasets is one of the objectives of data mining. Recently, AdaBoost has become popular in the machine-learning community thanks to its promising results across a variety of applications. However, training AdaBoost on large datasets is a major problem, especially when the dimensionality of the data is very high. This paper discusses the effect of high dimensionality on the training process of AdaBoost. Two preprocessing options for reducing dimensionality, namely principal component analysis and random projection, are briefly examined. Random projection, subject to a probabilistic length-preserving transformation, is explored further as a computationally light preprocessing step. The experimental results demonstrate the effectiveness of the proposed training process for handling high-dimensional large datasets.
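As an illustration of the kind of pipeline the abstract describes, the sketch below pairs a Johnson-Lindenstrauss-style Gaussian random projection with AdaBoost in scikit-learn; the dataset and parameters are invented for the example and are not the paper's.

```python
# Hedged sketch: random projection as a light preprocessing step before
# AdaBoost, in the spirit of the abstract. Sizes are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.random_projection import GaussianRandomProjection
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=2000, n_features=500, random_state=0)

# The random projection approximately preserves lengths and pairwise
# distances while cutting the dimensionality from 500 to 50.
model = make_pipeline(
    GaussianRandomProjection(n_components=50, random_state=0),
    AdaBoostClassifier(n_estimators=100, random_state=0),
)
print(cross_val_score(model, X, y, cv=3).mean())
```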

Relevance:

20.00%

Publisher:

Abstract:

This paper presents an overview of the issues in precisely defining, specifying, and evaluating the dependability of software, particularly in the context of computer-controlled process systems. Dependability is intended to be a generic term embodying various quality factors and is useful for both software and hardware. While developments in quality assurance and reliability theory have proceeded mostly in independent directions for hardware and software systems, we present here the case for developing a unified framework of dependability, a facet of the operational effectiveness of modern technological systems, and develop a hierarchical systems model that helps clarify this view. In the second half of the paper, we survey the models and methods available for measuring and improving software reliability. The nature of software "bugs", the failure history of the software system in the various phases of its life cycle, reliability growth in the development phase, estimation of the number of errors remaining in the operational phase, and the complexity of the debugging process are all considered in varying degrees of detail. We also discuss the notion of software fault tolerance, methods of achieving it, and the status of other measures of software dependability, such as maintainability, availability, and safety.

Relevance:

20.00%

Publisher:

Abstract:

We consider the problem of tracking a maneuvering target in clutter. In such an environment, missed detections and false alarms make it impossible to decide with certainty the origin of received echoes. Processing radar returns in cluttered environments consists of three functions: 1) target detection and plot formation, 2) plot-to-track association, and 3) track updating. Present approaches have two inadequacies: 1) optimization of the detection characteristics has not been considered, and 2) the features that can be used in the plot-to-track correlation process are restricted to a specific class. This paper presents a new approach to overcome these limitations. The approach facilitates tracking of a maneuvering target in clutter and improves tracking performance for weak targets.
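The new approach itself is not described in the abstract; as background on the plot-to-track association step, a standard Mahalanobis gating test (a textbook technique, not the paper's method) keeps only the echoes statistically consistent with a track's predicted measurement:

```python
# Hedged sketch: Mahalanobis gating for plot-to-track association.
# gamma = 9.21 is the chi-square 0.99 quantile for 2 degrees of freedom.
import numpy as np

def gate(plots, z_pred, S, gamma=9.21):
    """Keep plots whose normalized innovation falls inside the gate.
    z_pred: predicted measurement; S: innovation covariance."""
    S_inv = np.linalg.inv(S)
    kept = []
    for p in plots:
        d2 = (p - z_pred) @ S_inv @ (p - z_pred)  # squared Mahalanobis distance
        if d2 <= gamma:
            kept.append(p)
    return kept

# The far-away echo at (10, -3) is rejected as clutter.
plots = [np.array([1.0, 2.0]), np.array([10.0, -3.0])]
print(gate(plots, z_pred=np.array([0.8, 2.2]), S=np.eye(2)))
```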