40 results for automated meter reading (AMR)
Abstract:
The problems in measuring thermal emittance by the steady-state calorimetric technique are analyzed. A few suggestions to make the technique more accurate, simple, and rapid are discussed, and results are presented.
Abstract:
We describe an automated calorimeter for the measurement of specific heat in the temperature range 0.5 K < T < 10 K. It uses samples of moderate size (100–1000 mg), has moderate precision and accuracy (2%–5%), is easy to operate, and allows measurements to be made quickly and with economical use of He-4. The accuracy of the calorimeter was checked by measuring the specific heat of copper and of aluminium near its superconducting transition temperature.
Abstract:
A fully automated, versatile system for Temperature Programmed Desorption (TPD), Temperature Programmed Reaction (TPR) and Evolved Gas Analysis (EGA) has been designed and fabricated. The system consists of a micro-reactor that can be evacuated to 10⁻⁶ torr and heated from 30 to 750 °C at a rate of 5 to 30 °C per minute. The gas evolved from the reactor is analysed by a quadrupole mass spectrometer (1–300 amu). Data from each mass scan, together with the temperature at a given time, are acquired by a PC/AT system to generate thermograms. The functioning of the system is exemplified by the temperature programmed desorption of oxygen from YBa2Cu3−xCoxO7±δ, catalytic ammonia oxidation to NO over YBa2Cu3O7−δ, and anaerobic oxidation of methanol to CO2, CO and H2O over YBa2Cu3O7−δ (Y123) and PrBa2Cu3O7−δ (Pr123) systems.
Abstract:
We present a framework for performance evaluation of manufacturing systems subject to failure and repair. In particular, we determine the mean and variance of accumulated production over a specified time frame and show the usefulness of these results in system design and in evaluating operational policies for manufacturing systems. We extend this analysis to lead time as well. A detailed performability study is carried out for the generic model of a manufacturing system with centralized material handling. Several numerical results are presented, and the relevance of performability analysis in resolving system design issues is highlighted. Specific problems addressed include computing the distribution of total production over a shift period, determining the shift length necessary to deliver a given production target with a desired probability, and obtaining the distribution of Manufacturing Lead Time, all in the face of potential subsystem failures.
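For readers unfamiliar with performability measures, the distribution of accumulated production can be illustrated with a minimal Monte Carlo sketch. The machine below is hypothetical (a single station with exponential failure and repair times); the rates and shift length are illustrative choices, not the paper's model.

```python
import random

def shift_production(rate=10.0, mtbf=5.0, mttr=1.0, shift=8.0, rng=random):
    """Simulate one shift of a single machine alternating between 'up'
    (exponential time-to-failure, mean mtbf) and 'down' (exponential
    repair, mean mttr); return the production accumulated while up."""
    t, up, produced = 0.0, True, 0.0
    while t < shift:
        dwell = rng.expovariate(1.0 / (mtbf if up else mttr))
        dwell = min(dwell, shift - t)       # truncate at end of shift
        if up:
            produced += rate * dwell
        t += dwell
        up = not up
    return produced

random.seed(0)
samples = [shift_production() for _ in range(20000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / (len(samples) - 1)
```

Sampling many shifts this way also answers the design questions in the abstract empirically: the probability of meeting a production target is simply the fraction of sampled shifts at or above it.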
Abstract:
This paper addresses the problem of automated multiagent search in an unknown environment. Autonomous agents equipped with sensors carry out a search operation in a search space, where the uncertainty, or lack of information about the environment, is known a priori as an uncertainty density distribution function. The agents are deployed in the search space to maximize single-step search effectiveness. The centroidal Voronoi configuration, which achieves a locally optimal deployment, forms the basis for the proposed sequential deploy and search strategy. It is shown that with the proposed control law the agent trajectories converge in a globally asymptotic manner to the centroidal Voronoi configuration. Simulation experiments are provided to validate the strategy. Note to Practitioners: In this paper, searching an unknown region to gather information about it is modeled as a problem of using search as a means of reducing information uncertainty about the region. Moreover, multiple automated searchers or agents are used to carry out this operation optimally. This problem has many applications in search and surveillance operations using several autonomous UAVs or mobile robots. The concept of agents converging to the centroid of their Voronoi cells, weighted with the uncertainty density, is used to design a search strategy named sequential deploy and search. Finally, the performance of the strategy is validated using simulations.
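The deploy step can be sketched with a Lloyd-type iteration on a discretized search space: assign grid points to their nearest agent (a Voronoi partition) and move each agent to the uncertainty-weighted centroid of its cell. This toy setup with a made-up density is our own illustration, not the authors' continuous-time control law.

```python
import numpy as np

def deploy_and_search(agents, density, steps=50):
    """Iterate agents toward the density-weighted centroids of their
    Voronoi cells on a discrete grid (Lloyd-type sketch)."""
    h, w = density.shape
    ys, xs = np.mgrid[0:h, 0:w]
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    rho = density.ravel()
    for _ in range(steps):
        # assign each grid point to its nearest agent (Voronoi partition)
        d = ((pts[:, None, :] - agents[None, :, :]) ** 2).sum(axis=2)
        owner = d.argmin(axis=1)
        for i in range(len(agents)):
            cell = owner == i
            mass = rho[cell].sum()
            if mass > 0:   # move to the uncertainty-weighted centroid
                agents[i] = (pts[cell] * rho[cell, None]).sum(axis=0) / mass
    return agents

# toy example: all uncertainty concentrated in one corner of a 40x40 space
rng = np.random.default_rng(1)
density = np.zeros((40, 40))
density[25:, 25:] = 1.0
agents = rng.uniform(0, 39, size=(3, 2))
final = deploy_and_search(agents, density)
```

Because the weighted centroid of any nonempty cell lies inside the high-uncertainty corner, agents that capture uncertainty mass migrate there, mimicking the deployment behaviour the abstract describes.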
Abstract:
The theory, design, and performance of a solid electrolyte twin thermocell for the direct determination of the partial molar entropy of oxygen in a single-phase or multiphase mixture are described. The difference between the Seebeck coefficients of the concentric thermocells is directly related to the difference in the partial molar entropy of oxygen in the electrodes of each thermocell. The measured potentials are sensitive to small deviations from equilibrium at the electrodes. Small electric disturbances caused by simultaneous potential measurements or oxygen fluxes caused by large oxygen potential gradients between the electrodes also disturb the thermoelectric potential. An accuracy of ±0.5 cal(th) K−1 mol−1 has been obtained by this method for the entropies of formation of NiO and NiAl2O4. This “entropy meter” may be used for the measurement of the entropies of formation of simple or complex oxides with significant residual contributions which cannot be detected by heat-capacity measurements.
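The working relation behind the "entropy meter" can be sketched as follows. This is a hedged reconstruction assuming the four-electron oxygen electrode reaction; the lumped electrolyte/lead term S_el and the sign convention are our notation, not the paper's.

```latex
% Oxygen electrode reaction at each cell: O_2 + 4e^- \rightarrow 2\,O^{2-}
% Seebeck coefficient of thermocell i, with S_{\mathrm{el}} collecting the
% electrolyte and lead contributions common to both concentric cells:
\theta_i \;=\; \frac{dE_i}{dT}
        \;=\; \frac{\bar{S}^{(i)}_{\mathrm{O_2}} + S_{\mathrm{el}}}{4F}
% The common terms cancel in the difference, leaving the measured quantity
% directly proportional to the partial-molar-entropy difference:
\theta_1 - \theta_2 \;=\;
  \frac{\bar{S}^{(1)}_{\mathrm{O_2}} - \bar{S}^{(2)}_{\mathrm{O_2}}}{4F}
```

This cancellation is why the twin (differential) arrangement is used: neither cell's Seebeck coefficient alone gives the electrode entropy, but their difference does.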
Abstract:
In this paper, we propose an approach using Coloured Petri Nets (CPNs) for modelling flexible manufacturing systems. We illustrate our methodology for a Flexible Manufacturing Cell (FMC) with three machines and three robots. We also analyse the FMC for deadlocks using the invariant analysis of CPNs.
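The invariant analysis mentioned above rests on place invariants (P-invariants): weight vectors y with yᵀC = 0, where C is the net's incidence matrix, so the weighted token count yᵀm is conserved under every firing sequence. A minimal sketch on a hypothetical three-place ordinary net (not the paper's coloured FMC model):

```python
import numpy as np

# Incidence matrix C of a tiny ordinary Petri net (hypothetical example):
# rows = places p1..p3, columns = transitions t1, t2.
C = np.array([
    [-1,  1],   # p1: consumed by t1, produced by t2
    [ 1, -1],   # p2: produced by t1, consumed by t2
    [ 0,  0],   # p3: an idle resource place
], dtype=float)

# P-invariants span the left null space of C (vectors y with y^T C = 0);
# compute an orthonormal basis via the SVD of C^T.
u, s, vt = np.linalg.svd(C.T)
rank = int((s > 1e-10).sum())
invariants = vt[rank:]           # basis of the invariant space
```

In deadlock analysis, invariants with nonnegative weights certify that certain weighted resource counts can never change, which exposes markings that would permanently starve a transition.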
Abstract:
We propose a new paradigm for displaying comments: showing comments alongside parts of the article they correspond to. We evaluate the effectiveness of various approaches for this task and show that a combination of bag of words and topic models performs the best.
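As a rough illustration of the bag-of-words component, a comment can be matched to the article paragraph with the highest cosine similarity over term counts. The text below is made up, and the paper combines this signal with topic models rather than using it alone.

```python
import math
from collections import Counter

def bow(text):
    """Bag-of-words term counts for a lowercased, whitespace-split text."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two Counter term-count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# hypothetical article paragraphs and a reader comment
paragraphs = [
    "the senate passed the budget bill after a long debate",
    "local weather brought heavy rain and flooding downtown",
]
comment = "glad the budget bill finally passed the senate"
scores = [cosine(bow(comment), bow(p)) for p in paragraphs]
best = max(range(len(paragraphs)), key=scores.__getitem__)   # index 0 here
```

The comment would then be displayed alongside paragraph `best`, which is the paradigm the abstract proposes.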
Abstract:
Western Blot analysis is an analytical technique used in Molecular Biology, Biochemistry, Immunogenetics and related fields to separate proteins by electrophoresis. The procedure results in images containing nearly rectangular-shaped blots. In this paper, we address the problem of quantitation of the blots using automated image-processing techniques. We formulate a special active contour (or snake), called an Oblong, which locks on to rectangular-shaped objects. Oblongs depend on five free parameters, which is also the minimum number required for a unique characterization. Unlike many snake formulations, Oblongs do not require explicit gradient computations, so the optimization is carried out quickly. The performance of Oblongs is assessed on synthesized data and on Western Blot images.
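The five-parameter idea can be illustrated with a minimal parameterization of a rotated rectangle: center (cx, cy), width, height, and orientation theta. The sketch below is our own illustration of sampling contour points from those parameters, not the paper's energy formulation.

```python
import numpy as np

def oblong(cx, cy, width, height, theta, n=40):
    """Sample n points on a rotated rectangle ('oblong') defined by the
    five parameters (cx, cy, width, height, theta)."""
    # walk the perimeter of a unit axis-aligned rectangle, side by side
    t = np.linspace(0, 4, n, endpoint=False)
    side = t.astype(int)          # which of the four sides (0..3)
    f = t - side                  # fraction along that side
    x = np.empty(n); y = np.empty(n)
    x[side == 0] = -0.5 + f[side == 0];  y[side == 0] = -0.5
    x[side == 1] = 0.5;                  y[side == 1] = -0.5 + f[side == 1]
    x[side == 2] = 0.5 - f[side == 2];   y[side == 2] = 0.5
    x[side == 3] = -0.5;                 y[side == 3] = 0.5 - f[side == 3]
    # scale, rotate, and translate into image coordinates
    c, s = np.cos(theta), np.sin(theta)
    X = cx + width * x * c - height * y * s
    Y = cy + width * x * s + height * y * c
    return X, Y

X, Y = oblong(100, 80, 40, 20, 0.3)
```

An active-contour fit would then adjust these five numbers to maximize agreement between such sampled contours and the observed blot boundary, which is what makes the parameterization both minimal and unique.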
Abstract:
Faraday-type electromagnetic flow meters are employed to measure the flow rate of liquid sodium in fast breeder reactors. Calibrating such flow meters is rather difficult owing to the elaborate experimental arrangements required. The theoretical approach, on the other hand, requires the solution of two coupled electromagnetic partial differential equations, with the flow profile and the applied magnetic field as inputs; this too is quite involved because of the 3D nature of the problem. Alternatively, a numerical solution based on the Galerkin finite element method has been suggested in the literature as an attractive option for the required calibration. On that basis, a computer code has been developed in this work on the MATLAB platform with both 20- and 27-node brick elements. The boundary conditions are correctly defined, and several intermediate validation exercises are carried out. Finally, it is shown that the sensitivities predicted by the code for flow meters of four different dimensions agree well with the results given by the analytical expression, thereby providing strong validation. The sensitivity at higher flow rates, for which no analytical approach exists, is shown to decrease with increasing flow velocity.
Abstract:
Purpose: To develop a computationally efficient automated method for the optimal choice of the regularization parameter in diffuse optical tomography. Methods: The least-squares QR (LSQR)-type method that uses Lanczos bidiagonalization is known to be computationally efficient for performing the reconstruction procedure in diffuse optical tomography. It is deployed here within an optimization procedure that uses the simplex method to find the optimal regularization parameter. The proposed LSQR-type method is compared with traditional methods such as the L-curve, generalized cross-validation (GCV), and the recently proposed minimal residual method (MRM)-based choice of regularization parameter, using numerical and experimental phantom data. Results: The results indicate that the proposed LSQR-type and MRM-based methods perform similarly in terms of reconstructed image quality, and both are superior to the L-curve and GCV-based methods. The computational complexity of the proposed method is at least five times lower than that of the MRM-based method, making it an optimal technique. Conclusions: The LSQR-type method overcomes the inherent computational expense of the MRM-based automated way of finding the optimal regularization parameter in diffuse optical tomographic imaging, making it more suitable for real-time deployment. (C) 2013 American Association of Physicists in Medicine. [http://dx.doi.org/10.1118/1.4792459]
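For context, the kind of automated parameter selection being benchmarked here can be sketched with the classical SVD-based GCV choice for Tikhonov regularization. Note this is deliberately one of the baseline methods the paper compares against, not the authors' LSQR/simplex scheme, and the linear system below is synthetic, standing in for the DOT Jacobian system.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical ill-posed linear system y = J x + noise (a Gaussian
# blurring matrix), standing in for the tomographic forward problem.
n = 30
idx = np.arange(n)
J = np.exp(-(idx[:, None] - idx[None, :]) ** 2 / 8.0)
x_true = np.sin(np.linspace(0, np.pi, n))
y = J @ x_true + 1e-4 * rng.standard_normal(n)

# GCV via the SVD of J: pick the Tikhonov parameter minimizing the ratio
# of squared residual to squared effective degrees of freedom.
U, s, Vt = np.linalg.svd(J)
beta = U.T @ y

def gcv(lam):
    filt = s**2 / (s**2 + lam**2)                    # Tikhonov filter factors
    resid = np.sum(((1.0 - filt) * beta) ** 2)       # squared residual norm
    denom = np.sum(lam**2 / (s**2 + lam**2)) ** 2    # effective-dof term
    return resid / denom

lams = np.logspace(-6, 0, 50)
lam_best = lams[np.argmin([gcv(l) for l in lams])]
# regularized solution at the chosen parameter (division-safe form)
x_rec = Vt.T @ (beta * s / (s**2 + lam_best**2))
```

The coarse grid here is a simple stand-in for the simplex search the paper uses; the point of the abstract is that evaluating such criteria via LSQR avoids the full SVD, which is what makes the method cheap enough for real-time use.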
Abstract:
Adaptive Mesh Refinement (AMR) is a method that dynamically varies the spatio-temporal resolution of localized mesh regions in numerical simulations, based on the strength of the solution features. In-situ visualization plays an important role in analyzing the time-evolving characteristics of the domain structures. Continuous visualization of the output data at various timesteps enables a better study of the underlying domain and of the model used to simulate it. In this paper, we develop strategies for continuous online visualization of time-evolving data for AMR applications executed on GPUs. We reorder the meshes for computation on the GPU based on the user's input about the subdomain to be visualized, which makes the data available for visualization at a faster rate. We then perform the visualization steps and fix-up operations asynchronously on the CPUs while the GPU advances the solution. In experiments on Tesla S1070 and Fermi C2070 clusters, we found that our strategies yield a 60% improvement in response time and a 16% improvement in the rate of visualization of frames over the existing strategy of performing fix-ups and visualization at the end of the timesteps.
Abstract:
The ribosomal P-site hosts the peptidyl-tRNAs during translation elongation. Which P-site elements support these tRNA species in maintaining codon-anticodon interactions has remained unclear. We investigated the effects of the P-site features of the methylations of G966 and C967, and of the conserved C-terminal tail sequence of Ser, Lys, and Arg (SKR) of the S9 ribosomal protein, on maintenance of the translational reading frame of an mRNA. We generated Escherichia coli strains deleted for the SKR sequence of the S9 ribosomal protein, for RsmB (which methylates C967), and for RsmD (which methylates G966), and used them to translate LacZ from its +1 and −1 out-of-frame constructs. We show that the S9 SKR tail prevents both +1 and −1 frameshifts and plays a general role in holding the P-site tRNA/peptidyl-tRNA in place. In contrast, the G966 and C967 methylations did not contribute directly to the maintenance of the translational frame. However, deletion of rsmB in the S9Δ3 background caused significantly increased −1 frameshifting at 37 °C. Interestingly, the effects of the deficiency of C967 methylation were annulled when the E. coli strain was grown at 30 °C, supporting its context-dependent role.
Abstract:
Network theory applied to protein structures provides insights into numerous problems of biological relevance. The explosion in structural data available from the PDB and from simulations establishes the need for a standalone, efficient program that assembles network concepts and parameters under one hood in an automated manner. Herein, we discuss the development and application of an exhaustive, user-friendly, standalone program package named PSN-Ensemble, which can handle structural ensembles generated through molecular dynamics (MD) simulations or NMR studies, or from multiple X-ray structures. The novelty in the network construction lies in the explicit consideration of side-chain interactions among amino acids. The program evaluates network parameters dealing with topological organization and long-range allosteric communication. The introduction in PSN-Ensemble of a flexible weighting scheme in terms of residue pairwise cross-correlation or interaction energy brings dynamical and chemical knowledge into the network representation. The results are also mapped onto a graphical display of the structure, giving the general biological community easy access to the network analysis. The potential of PSN-Ensemble for examining structural ensembles is exemplified using MD trajectories of a ubiquitin-conjugating enzyme (UbcH5b). Furthermore, insights derived from network parameters evaluated using PSN-Ensemble for single static structures of the active and inactive states of the β2-adrenergic receptor and the ternary tRNA complexes of tyrosyl-tRNA synthetases (from organisms across kingdoms) are discussed. PSN-Ensemble is freely available from http://vishgraph.mbu.iisc.ernet.in/PSN-Ensemble/psn_index.html.
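The basic object such a package builds, a residue interaction network, can be sketched as a distance-cutoff contact graph. The coordinates below are synthetic and the 6.5 Å cutoff is an arbitrary illustrative choice; PSN-Ensemble itself uses normalized side-chain interaction strengths rather than plain distances.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical side-chain-like coordinates for a 60-residue chain,
# generated as a 3D random walk (a stand-in for real PDB coordinates).
n_res = 60
coords = np.cumsum(rng.normal(scale=2.0, size=(n_res, 3)), axis=0)

# Nodes = residues; edge when two sequence-non-adjacent residues fall
# within the distance cutoff.  This adjacency matrix is the network on
# which topological parameters (degree, hubs, paths) are evaluated.
d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
seq_sep = np.abs(np.arange(n_res)[:, None] - np.arange(n_res)[None, :])
adj = (d < 6.5) & (seq_sep > 1)

degree = adj.sum(axis=1)           # node connectivity
hubs = np.where(degree >= 4)[0]    # highly connected residues ("hubs")
```

Replacing the boolean cutoff with per-pair weights (cross-correlations or interaction energies) gives exactly the kind of weighted network the abstract describes.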
Abstract:
Objective identification and description of mimicked calls is a primary component of any study on avian vocal mimicry, but few studies have adopted a quantitative approach. We used spectral feature representations commonly used in human speech analysis, in combination with various distance metrics, to distinguish between mimicked and non-mimicked calls of the greater racket-tailed drongo, Dicrurus paradiseus, and cross-validated the results with human assessment of spectral similarity. We found that the automated method and human subjects performed similarly in terms of the overall number of correct matches of mimicked calls to putative model calls. However, the two methods misclassified different subsets of calls, and we achieved a maximum accuracy of ninety-five percent only when we combined the results of both methods. This study is the first to use Mel-frequency Cepstral Coefficients and Relative Spectral Amplitude-filtered Linear Predictive Coding coefficients to quantify vocal mimicry. Our findings also suggest that, in spite of several advances in automated methods of song analysis, corresponding cross-validation by humans remains essential.