16 results for PHARMACY-BASED MEASURES

at Indian Institute of Science - Bangalore - India


Relevance: 40.00%

Abstract:

We present a new approach to spoken language modeling for language identification (LID) using the Lempel-Ziv-Welch (LZW) algorithm. The LZW technique is applicable to any kind of tokenization of the speech signal. Because the LZW algorithm efficiently extracts variable-length symbol strings from the training data, the LZW codebook captures the essentials of a language effectively. We develop two new deterministic measures for LID based on the LZW algorithm, namely (i) a compression ratio score (LZW-CR) and (ii) a weighted discriminant score (LZW-WDS). To assess these measures, we consider both error-free tokenization of speech and artificially induced noise in the tokenization. It is shown that for a six-language LID task on the OGI-TS database with clean tokenization, the new model (LZW-WDS) performs slightly better than the conventional bigram model. For noisy tokenization, which is the more realistic case, LZW-WDS significantly outperforms the bigram technique.
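
To make the compression-ratio idea concrete, here is a minimal sketch assuming utterances are already tokenized into strings of symbols; the dictionary building follows the textbook LZW scheme, and the scoring rule (fewer emitted codes per symbol means a better fit to the training language) is a hedged reading of LZW-CR, not the authors' exact formulation.

```python
# Minimal sketch of an LZW compression-ratio language score.
# Assumes utterances are tokenized into symbol strings; the scoring
# rule is an illustrative reading of LZW-CR, not the paper's exact one.

def lzw_dictionary(training_text, alphabet):
    """Build an LZW dictionary of variable-length strings from training data."""
    table = {c: i for i, c in enumerate(sorted(alphabet))}
    w = ""
    for c in training_text:
        if w + c in table:
            w += c
        else:
            table[w + c] = len(table)  # new variable-length entry
            w = c
    return table

def compression_ratio(text, table):
    """Codes emitted per input symbol: lower = better match to the language."""
    codes, w = 0, ""
    for c in text:
        if w + c in table:
            w += c
        else:
            codes += 1  # emit a code for w, start a new phrase
            w = c
    if w:
        codes += 1
    return codes / max(len(text), 1)

# Usage: pick the language whose codebook compresses the test utterance best.
alphabet = set("abc")
books = {lang: lzw_dictionary(data, alphabet)
         for lang, data in {"L1": "abababab", "L2": "abcabcabc"}.items()}
print(min(books, key=lambda lang: compression_ratio("ababab", books[lang])))
```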

Relevance: 30.00%

Abstract:

Monitoring gas purity is an important aspect of gas recovery stations, where air is usually one of the major impurities. Purity monitors of the katharometric type are commercially available for this purpose. As an alternative, we discuss here a helium gas purity monitor based on the acoustic resonance of a cavity at audio frequencies. It measures purity by monitoring the resonant frequency of a cylindrical cavity filled with the gas under test and excited by conventional telephone transducers fixed at the ends; the use of the latter simplifies the design considerably. The paper discusses the details of the resonant cavity and the electronic circuit, along with temperature compensation. The unit has been calibrated with helium gas of known purities. It has a response time of the order of 10 minutes and measures gas purity to an accuracy of 0.02%. The unit has been installed in our helium recovery system and is found to perform satisfactorily.
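
The underlying physics is simple: the cavity's longitudinal resonance f = nc/2L tracks the speed of sound c = √(γRT/M), and air impurity raises the mixture's effective molar mass M, lowering f. The back-of-envelope sketch below illustrates this; the cavity length, mode number, and the simplification of keeping helium's γ for the mixture are assumptions, not the paper's design values.

```python
# Illustrative acoustic-purity calculation: resonance f = n*c/(2L) with
# c = sqrt(gamma*R*T/M); air impurity raises the effective molar mass M.
# Cavity length, mode and temperature are assumed for illustration only.
import math

R = 8.314             # J/(mol K)
GAMMA_HE = 5.0 / 3.0  # monatomic ideal gas
M_HE, M_AIR = 4.0026e-3, 28.97e-3  # kg/mol
L, MODE, T = 0.20, 1, 300.0        # assumed cavity length (m), mode, temp (K)

def resonant_freq(x_air):
    """Resonance of mode n for a He/air mixture with air mole fraction x_air."""
    m_mix = (1 - x_air) * M_HE + x_air * M_AIR  # effective molar mass
    c = math.sqrt(GAMMA_HE * R * T / m_mix)     # gamma of air neglected: sketch
    return MODE * c / (2 * L)

for x in (0.0, 0.001, 0.01):  # 100%, 99.9%, 99% pure helium
    print(f"air fraction {x:.3%}: f = {resonant_freq(x):.1f} Hz")
```

For these assumed dimensions the resonance lands near 2.5 kHz, squarely in the audio band the abstract mentions, and small air fractions shift it measurably downward.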

Relevance: 30.00%

Abstract:

Time-frequency analyses of various simulated and experimental signals arising from elastic wave scattering at damage are performed using the wavelet transform (WT) and the Hilbert-Huang transform (HHT), and their performances are compared in the context of quantifying damage. The spectral finite element method is employed for numerical simulation of wave scattering. An analytical study is carried out on the effects of higher-order damage parameters on the wave reflected from a damage. Based on this study, error bounds are computed for the signals in the spectral and also in the time-frequency domains. It is shown how such an error bound can provide an estimate of the error in modelling wave propagation in a structure with damage. Measures of damage based on WT and HHT are derived to quantify the damage information hidden in the signal. The aim of this study is to obtain detailed insight into the problems of (1) identifying localised damage, (2) dispersion of multifrequency non-stationary signals after they interact with various types of damage, and (3) quantifying the damage. Sensitivity analysis of the scattered-wave signal based on the time-frequency representation helps to correlate the variation of the damage index measures with damage parameters such as damage size and material degradation factors.
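
As a hedged illustration of what a damage index extracted from spectral content can look like, the sketch below compares band energies of a baseline and a damaged signal. It is a generic energy-based stand-in, not the paper's WT/HHT-derived measures; the synthetic signals and band limits are assumptions.

```python
# Hedged sketch of a spectral-energy damage index: compare band energy
# of baseline vs. damaged sensor signals. Illustrates the "damage
# measure from a time-frequency representation" idea only; the paper's
# exact WT/HHT indices are not reproduced here.
import numpy as np

def band_energy(signal, fs, f_lo, f_hi):
    """Energy of the signal inside the frequency band [f_lo, f_hi]."""
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return float(np.sum(np.abs(spec[band]) ** 2))

def damage_index(baseline, damaged, fs, f_lo, f_hi):
    """Relative change of band energy between healthy and damaged states."""
    e0 = band_energy(baseline, fs, f_lo, f_hi)
    return abs(band_energy(damaged, fs, f_lo, f_hi) - e0) / e0

# Synthetic usage: a damage-induced reflection adds energy near 40 kHz.
fs, t = 1e6, np.arange(0, 1e-3, 1e-6)
healthy = np.sin(2 * np.pi * 40e3 * t) * np.exp(-((t - 2e-4) / 5e-5) ** 2)
damaged = healthy + 0.3 * np.sin(2 * np.pi * 40e3 * t) \
    * np.exp(-((t - 6e-4) / 5e-5) ** 2)
print(f"damage index: {damage_index(healthy, damaged, fs, 30e3, 50e3):.3f}")
```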

Relevance: 30.00%

Abstract:

In this paper, an approach for automatic road extraction in an urban region using structural, spectral and geometric characteristics of roads is presented. Roads are extracted in two stages: pre-processing and road extraction. Initially, the image is pre-processed to improve the tolerance by reducing clutter (mostly buildings, parking lots, vegetation regions and other open spaces). Road segments are then extracted using Texture Progressive Analysis (TPA) and the Normalized cut algorithm. The TPA technique uses binary segmentation based on three levels of texture statistical evaluation to extract road segments, whereas the Normalized cut method is a graph-based method that generates an optimal partition of road segments. The performance (quality measures) of road extraction using TPA and the Normalized cut method is compared. The experimental results show that the Normalized cut method is efficient in extracting road segments in urban regions from high-resolution satellite images.
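
The Normalized cut step is the classic Shi-Malik spectral relaxation: partition the pixel graph by thresholding the second-smallest generalized eigenvector of (D − W)y = λDy. The minimal sketch below shows that core computation on toy intensity features; the TPA stage and the machinery needed for full-size satellite images are out of scope here.

```python
# Minimal sketch of the normalized-cut relaxation (Shi & Malik):
# bipartition samples by thresholding the second-smallest generalized
# eigenvector of (D - W) y = lambda * D y. Toy features stand in for
# real pixel descriptors.
import numpy as np
from scipy.linalg import eigh

def ncut_bipartition(features, sigma=0.5):
    """Bipartition samples using a Gaussian affinity on feature distances."""
    d2 = np.sum((features[:, None, :] - features[None, :, :]) ** 2, axis=-1)
    W = np.exp(-d2 / (2 * sigma ** 2))   # affinity matrix
    D = np.diag(W.sum(axis=1))           # degree matrix
    vals, vecs = eigh(D - W, D)          # generalized symmetric eigenproblem
    fiedler = vecs[:, 1]                 # second-smallest eigenvector
    return fiedler > np.median(fiedler)  # threshold into two segments

# Toy usage: two intensity clusters standing in for road / non-road pixels.
rng = np.random.default_rng(0)
pix = np.concatenate([rng.normal(0.2, 0.05, (20, 1)),
                      rng.normal(0.8, 0.05, (20, 1))])
print(ncut_bipartition(pix).astype(int))
```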

Relevance: 30.00%

Abstract:

In this paper we present a cache coherence protocol for multistage interconnection network (MIN)-based multiprocessors with two distinct private caches: private-block caches (PCache) containing blocks private to a process, and shared-block caches (SCache) containing data accessible by all processes. The architecture is extended by a coherence control bus connecting all shared-block cache controllers. Timing problems due to variable transit delays through the MIN are dealt with by introducing transient states in the proposed cache coherence protocol. The impact of the coherence protocol on system performance is evaluated through a three-phase performance study. Assuming homogeneity of all nodes, a single-node queuing model (phase 3) is developed to analyze system performance. This model is solved for processor and coherence bus utilizations using the mean value analysis (MVA) technique, with shared-block steady-state probabilities (phase 1) and communication delays (phase 2) as input parameters. The performance of our system is compared to that of a system with an equivalent-sized unified cache and to a multiprocessor implementing a directory-based coherence protocol. System performance measures are verified through simulation.
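
For readers unfamiliar with MVA, the phase-3 step rests on the standard exact recursion for a closed single-class queueing network, sketched below. The station service demands (processor, MIN, coherence bus) are illustrative placeholders, not values from the paper.

```python
# Sketch of exact Mean Value Analysis (MVA) for a closed, single-class
# queueing network -- the textbook recursion behind the phase-3 model.
# Service demands are hypothetical, not taken from the paper.

def mva(service_demands, n_customers):
    """Exact MVA: per-station utilizations and system throughput."""
    queue = [0.0] * len(service_demands)    # mean queue lengths at N = 0
    for n in range(1, n_customers + 1):
        # mean response time per station (arrival theorem)
        resp = [d * (1 + q) for d, q in zip(service_demands, queue)]
        thruput = n / sum(resp)              # system throughput
        queue = [thruput * r for r in resp]  # Little's law per station
    utils = [thruput * d for d in service_demands]
    return utils, thruput

# Usage: hypothetical demands (seconds/visit) for processor, MIN, bus.
utils, x = mva([0.010, 0.004, 0.002], n_customers=8)
print("utilizations:", [round(u, 3) for u in utils],
      "throughput:", round(x, 1))
```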

Relevance: 30.00%

Abstract:

A new class of nets, called S-nets, is introduced for the performance analysis of scheduling algorithms used in real-time systems. Deterministic timed Petri nets do not adequately model the scheduling of resources encountered in real-time systems, and need to be augmented with resource places, signal places and a scheduler block to facilitate the modeling of scheduling algorithms. The tokens are colored, and the transition firing rules are suitably modified. Further, the concept of transition folding is used to obtain intuitively simple models of multiframe real-time systems. Two generic performance measures, called "load index" and "balance index," which characterize the resource utilization and the uniformity of workload distribution, respectively, are defined. The utility of S-nets for evaluating heuristic-based scheduling schemes is illustrated by considering three heuristics for real-time scheduling. S-nets are useful in tuning the hardware configuration and the underlying scheduling policy, so that system utilization is maximized and the workload distribution among the computing resources is balanced.
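
The abstract does not spell out how the two indices are computed, so the sketch below gives one plausible formulation — emphatically not the paper's definitions: a load index as mean utilization across resources, and a balance index that approaches 1 as the workload spread shrinks.

```python
# Plausible formulations of the two generic measures -- NOT the paper's
# exact definitions: load index as mean utilization, balance index as
# 1 minus the coefficient of variation of the per-resource load.
import statistics

def load_index(utilizations):
    """Average resource utilization over the schedule horizon."""
    return statistics.mean(utilizations)

def balance_index(utilizations):
    """1 for a perfectly uniform load; decreases as the spread grows."""
    mu = statistics.mean(utilizations)
    if mu == 0:
        return 1.0
    return max(0.0, 1.0 - statistics.pstdev(utilizations) / mu)

# Usage: utilization of three processors under some scheduling heuristic.
u = [0.82, 0.75, 0.79]
print(f"load index {load_index(u):.2f}, balance index {balance_index(u):.2f}")
```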

Relevance: 30.00%

Abstract:

A new approach based on occupation measures is introduced for studying stochastic differential games. For two-person zero-sum games, the existence of values and optimal strategies for both players is established for various payoff criteria. For N-person games, the existence of equilibria in Markov strategies is established for various cases.
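
Since the whole approach rests on this one device, it may help to display one standard formulation of a discounted occupation measure; the abstract does not fix the payoff criterion, so this is illustrative of the general construction rather than the authors' exact setup.

```latex
% Illustrative: the alpha-discounted occupation measure of the state
% process X_t and the strategy pair (u, v), for a measurable set
% A x B of state-action pairs and discount rate alpha > 0.
\[
  \mu^{u,v}(A \times B)
    = \alpha \,\mathbb{E}^{u,v}\!\left[
        \int_0^{\infty} e^{-\alpha t}\,
        \mathbf{1}_{A \times B}\bigl(X_t, (u_t, v_t)\bigr)\, dt
      \right]
\]
```

The discounted payoff then becomes a linear functional of the measure, so existence questions reduce to optimization over a convex set of occupation measures.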

Relevance: 30.00%

Abstract:

We are concerned with the situation in which a wireless sensor network is deployed in a region for the purpose of detecting an event occurring at a random time and at a random location. The sensor nodes periodically sample their environment (e.g., for acoustic energy), process the observations (in our case, using a CUSUM-based algorithm), and send a local decision (which is binary in nature) to the fusion centre. The fusion centre collects these local decisions and uses a fusion rule to infer the state of nature, i.e., whether an event has occurred or not. Our main contribution is in analyzing two local detection rules in combination with a simple fusion rule. The local detection algorithms are based on the nonparametric CUSUM procedure from sequential statistics. We also propose two ways to operate the local detectors after an alarm. These alternatives, when combined in various ways, yield several approaches. Our contribution is to provide analytical techniques to calculate false alarm measures, by the use of which the local detector thresholds can be set. Simulation results are provided to evaluate the accuracy of our analysis. As an illustration we provide a design example. We also use simulations to compare the detection delays incurred by these algorithms.
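
The nonparametric CUSUM recursion each node runs locally is the classic S_k = max(0, S_{k−1} + x_k − δ), with a local alarm when S_k crosses a threshold h. The sketch below shows this recursion; the drift δ and threshold h are design parameters, and the values and synthetic observations here are illustrative only.

```python
# The classic nonparametric CUSUM recursion for a sensor node's local
# detector: S_k = max(0, S_{k-1} + x_k - delta), alarm when S_k > h.
# delta (drift) and h (threshold) are design parameters; values below
# are illustrative.
import random

def cusum_detector(samples, delta, h):
    """Yield (sample_index, binary local decision) for each observation."""
    s = 0.0
    for k, x in enumerate(samples):
        s = max(0.0, s + x - delta)
        yield k, int(s > h)

# Usage: noise-only samples, then an event raising the mean energy.
random.seed(1)
obs = [random.gauss(0.0, 1.0) for _ in range(100)] \
    + [random.gauss(1.5, 1.0) for _ in range(50)]
alarms = [k for k, d in cusum_detector(obs, delta=0.5, h=8.0) if d]
print("first local alarm at sample", alarms[0] if alarms else None)
```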

Relevance: 30.00%

Abstract:

Analysis of high-resolution satellite images has been an important research topic for urban analysis. One of the important features of urban areas in urban analysis is automatic road network extraction. Two approaches for road extraction, based on the Level Set and Mean Shift methods, are proposed. Extracting roads directly from the original image is difficult and computationally expensive due to the presence of other road-like features with straight edges. The image is therefore pre-processed to improve the tolerance by reducing noise (buildings, parking lots, vegetation regions and other open spaces): roads are first extracted as elongated regions, and nonlinear noise segments are removed using a median filter (based on the fact that road networks consist of a large number of small linear structures). Road extraction is then performed using the Level Set and Mean Shift methods. Finally, the accuracy of the extracted roads is evaluated using quality measures. The 1 m resolution IKONOS data has been used for the experiment.
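
A toy version of the median-filter-then-Mean-Shift stage is sketched below on a synthetic image; the bandwidth, the assumed "road tone", and the synthetic data are placeholders, and the paper's full pipeline on IKONOS imagery is far richer than this.

```python
# Minimal sketch of the Mean Shift stage: median-filter the image to
# suppress small non-road structures, cluster pixel intensities with
# mean shift, and keep the cluster nearest an assumed road tone.
import numpy as np
from scipy.ndimage import median_filter
from sklearn.cluster import MeanShift

rng = np.random.default_rng(0)
img = rng.normal(0.3, 0.05, (32, 32))           # synthetic background
img[14:18, :] = rng.normal(0.8, 0.05, (4, 32))  # bright road-like strip

smoothed = median_filter(img, size=3)           # remove small linear noise
pixels = smoothed.reshape(-1, 1)
labels = MeanShift(bandwidth=0.15).fit_predict(pixels)

# Keep the cluster whose mean intensity is closest to the assumed road tone.
road_tone = 0.8
means = [pixels[labels == c].mean() for c in np.unique(labels)]
road_cluster = int(np.argmin([abs(m - road_tone) for m in means]))
road_mask = (labels == road_cluster).reshape(img.shape)
print("road pixels found:", int(road_mask.sum()))
```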

Relevance: 30.00%

Abstract:

The realization of cloud computing has been possible due to the availability of virtualization technologies on commodity platforms. Measuring resource usage on virtualized servers is difficult because the performance counters used for resource accounting are not virtualized. Hence, many of the prevalent virtualization technologies, such as Xen, VMware and KVM, use host-specific CPU usage monitoring, which is coarse-grained. In this paper, we present a performance monitoring tool for KVM-based virtual machines which measures the CPU overhead incurred by the hypervisor on behalf of the virtual machine, along with the CPU usage of the virtual machine itself. This fine-grained resource usage information, provided by the above tool, can be used in diverse situations, such as resource provisioning to support performance-related QoS requirements, identification of bottlenecks during VM placement, and resource profiling of applications in cloud environments. We demonstrate a use case of this tool by measuring the performance of web servers hosted on a KVM-based virtualized server.
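
The starting point for such host-side accounting: under KVM, each guest is a QEMU process, so its user and system CPU time can be sampled from /proc/&lt;pid&gt;/stat (fields 14 and 15, in clock ticks). The sketch below does only this coarse sampling, using system time as a rough stand-in for kernel work done on the VM's behalf; the paper's tool separates hypervisor overhead more finely than this.

```python
# Minimal host-side CPU accounting for a KVM guest: sample the QEMU
# process's utime/stime from /proc/<pid>/stat (fields 14 and 15, in
# clock ticks). System time is a rough proxy for in-kernel work done
# on the VM's behalf; the paper's tool is finer-grained than this.
import os
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # clock ticks per second

def cpu_seconds(pid):
    """(user, system) CPU seconds consumed by process `pid` so far."""
    with open(f"/proc/{pid}/stat") as f:
        fields = f.read().rsplit(")", 1)[1].split()  # skip "pid (comm)"
    utime, stime = int(fields[11]), int(fields[12])  # fields 14 and 15
    return utime / CLK_TCK, stime / CLK_TCK

def sample_usage(pid, interval=1.0):
    """Percent CPU (user, system) for `pid` over one sampling interval."""
    u0, s0 = cpu_seconds(pid)
    time.sleep(interval)
    u1, s1 = cpu_seconds(pid)
    return 100 * (u1 - u0) / interval, 100 * (s1 - s0) / interval

# Usage (assumes `qemu_pid` holds the VM's QEMU process id):
# user_pct, sys_pct = sample_usage(qemu_pid)
```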

Relevance: 30.00%

Abstract:

Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet- and Empirical Mode Decomposition (EMD)-based time series models. Methods have been developed in the past to decompose a time series into components; forecasting these components, combined with the random component, can yield predictions. Using this idea, wavelet and EMD analyses are incorporated separately, each decomposing a time series into independent orthogonal components with both time and frequency localization. The component series are fitted with specific auto-regressive models to obtain forecasts, which are later combined to obtain the actual predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. Predictability is checked for six- and twelve-month-ahead forecasts across both methodologies. Based on performance measures, it is observed that the wavelet-based method has better prediction capability than the EMD-based method, despite some of the limitations of time series methods and of the manner in which the decomposition takes place. Finally, the study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Some modifications that could be made to the model to extend its applicability to other areas of hydrology are also discussed.
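
The decompose-forecast-recombine skeleton is sketched below using a simplified additive à-trous-style decomposition (repeated smoothing, with details defined as successive differences so the components sum exactly back to the series) and a small least-squares AR model per component. The paper uses proper wavelet and EMD machinery; this sketch, with its synthetic monthly series, shows only the overall structure.

```python
# Hedged sketch of decompose -> per-component AR forecast -> recombine.
# The additive decomposition below is a simplified a-trous-style scheme,
# not the paper's wavelet/EMD transforms.
import numpy as np

def decompose(x, levels=3):
    """Return [D1..Dlevels, A_levels]; the components sum exactly to x."""
    comps, approx = [], x.astype(float)
    for j in range(levels):
        kernel = np.ones(2 ** (j + 1) + 1)
        kernel /= kernel.sum()
        smooth = np.convolve(approx, kernel, mode="same")
        comps.append(approx - smooth)   # detail at scale j
        approx = smooth
    comps.append(approx)                # final approximation
    return comps

def ar_forecast(series, order=4, steps=6):
    """Least-squares AR(order) fit, then recursive multi-step forecast."""
    X = np.column_stack([series[i:len(series) - order + i]
                         for i in range(order)])
    y = series[order:]
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    hist, out = list(series[-order:]), []
    for _ in range(steps):
        nxt = float(np.dot(coef, hist[-order:]))
        out.append(nxt)
        hist.append(nxt)
    return np.array(out)

# Usage: forecast each component, then sum the component forecasts.
t = np.arange(240)  # 20 years of synthetic monthly flow
flow = 10 + 3 * np.sin(2 * np.pi * t / 12) \
    + np.random.default_rng(0).normal(0, 0.5, 240)
print(np.round(sum(ar_forecast(c) for c in decompose(flow)), 2))
```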

Relevance: 30.00%

Abstract:

Feeding 9-10 billion people by 2050 and preventing dangerous climate change are two of the greatest challenges facing humanity. Both challenges must be met while reducing the impact of land management on ecosystem services that deliver vital goods and services and support human health and well-being. Few studies to date have considered the interactions between these challenges. In this study we briefly outline the challenges, review the supply- and demand-side climate mitigation potential available in the Agriculture, Forestry and Other Land Use (AFOLU) sector, and review options for delivering food security. We briefly outline some of the synergies and trade-offs afforded by mitigation practices, before presenting an assessment of the mitigation potential possible in the AFOLU sector under future scenarios in which demand-side measures co-deliver to aid food security. We conclude that while supply-side mitigation measures, such as changes in land management, might either enhance or negatively impact food security, demand-side mitigation measures, such as reduced waste or reduced demand for livestock products, should benefit both food security and greenhouse gas (GHG) mitigation. Demand-side measures offer a greater potential (1.5-15.6 Gt CO2-eq. yr⁻¹) for meeting both challenges than supply-side measures (1.5-4.3 Gt CO2-eq. yr⁻¹ at carbon prices between 20 and 100 US$ per t CO2-eq.), but given the enormity of the challenges, all options need to be considered. Supply-side measures should be implemented immediately, focussing on those that allow the production of more agricultural product per unit of input. For demand-side measures, given the difficulties in their implementation and the lag in their effectiveness, policy should be introduced quickly and should aim to co-deliver to other policy agendas, such as improving environmental quality or dietary health. These problems facing humanity in the 21st century are extremely challenging, and policy that addresses multiple objectives is required now more than ever.

Relevance: 30.00%

Abstract:

The increasing number of available protein structures requires efficient tools for multiple structure comparison. Indeed, multiple structural alignments are essential for the analysis of the function, evolution and architecture of protein structures. For this purpose, we propose a new web server called multiple Protein Block Alignment (mulPBA). This server implements a method based on a structural alphabet that describes the backbone conformation of a protein chain in terms of dihedral angles. This 'sequence-like' representation enables the use of powerful sequence alignment methods for primary structure comparison, followed by an iterative refinement of the structural superposition. This approach yields alignments superior to most rigid-body alignment methods and highly comparable with flexible structure comparison approaches. We implement this method in a web server designed to perform multiple structure superimpositions on a set of structures given by the user. Outputs are given both as a sequence alignment and as superposed 3D structures, visualized directly via static images generated by PyMOL or through a Jmol applet allowing dynamic interaction. Multiple global quality measures are given. Relatedness between structures is indicated by a distance dendrogram. Superimposed structures in PDB format can also be downloaded, and the results are obtained quickly. The mulPBA server can be accessed at www.dsimb.inserm.fr/dsimb_tools/mulpba/.
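
The core idea — turning 3D backbone geometry into a letter string that ordinary sequence-alignment machinery can handle — is sketched below as nearest-prototype encoding of dihedral windows. The real Protein Blocks alphabet has 16 prototypes defined over 8 dihedral angles; the two (phi, psi) prototypes here are hypothetical stand-ins for illustration only.

```python
# Sketch of the structural-alphabet idea behind mulPBA: assign each
# residue's local dihedral geometry to the nearest prototype "block",
# producing a letter string. The two prototypes below are hypothetical;
# the real Protein Blocks alphabet has 16 blocks over 8 dihedrals.
import math

PROTOTYPES = {  # hypothetical (phi, psi) prototypes, in degrees
    "m": (-63.0, -41.0),   # helix-like block
    "d": (-120.0, 130.0),  # strand-like block
}

def angle_dist(a, b):
    """Distance between dihedral tuples, wrapping angles at 180 degrees."""
    return math.sqrt(sum(min(abs(x - y), 360 - abs(x - y)) ** 2
                         for x, y in zip(a, b)))

def encode(dihedrals):
    """Map a list of (phi, psi) pairs to a structural-alphabet string."""
    return "".join(min(PROTOTYPES, key=lambda k: angle_dist(PROTOTYPES[k], d))
                   for d in dihedrals)

# Usage: a short helix followed by a strand encodes as "mmmdd".
chain = [(-60, -45), (-65, -40), (-58, -47), (-125, 135), (-118, 128)]
print(encode(chain))
```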

Relevance: 30.00%

Abstract:

The performance of prediction models is often based on "abstract metrics" that estimate the model's ability to limit residual errors between observed and predicted values. However, meaningful evaluation and selection of prediction models for end-user domains requires holistic and application-sensitive performance measures. Inspired by energy consumption prediction models used in the emerging "big data" domain of Smart Power Grids, we propose a suite of performance measures to rationally compare models along the dimensions of scale independence, reliability, volatility and cost. We include both application-independent and application-dependent measures, the latter parameterized to allow customization by domain experts to fit their scenario. While our measures are generalizable to other domains, we offer an empirical analysis using real energy use data for three Smart Grid applications: planning, customer education and demand response, which are relevant for energy sustainability. Our results underscore the value of the proposed measures in offering deeper insight into models' behavior and their impact on real applications, benefiting both data mining researchers and practitioners.
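
To give the flavor of such application-sensitive measures, the sketch below offers plausible formulations in the spirit of the abstract — emphatically not the authors' exact suite: a scale-independent relative error, a volatility measure (spread of the errors over time), and a cost-weighted error whose weights a domain expert could set to penalize, say, misses during demand-response windows.

```python
# Plausible formulations in the spirit of the abstract -- NOT the
# authors' exact measures: scale-independent error, error volatility,
# and a domain-weighted (cost-sensitive) error.
import numpy as np

def scale_independent_error(actual, predicted):
    """Mean absolute percentage error (scale-free across customers)."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.mean(np.abs((a - p) / a)))

def volatility(actual, predicted):
    """Standard deviation of the relative error sequence."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    return float(np.std((a - p) / a))

def cost_weighted_error(actual, predicted, weights):
    """Relative error weighted by a domain-supplied cost per time slot."""
    a, p = np.asarray(actual, float), np.asarray(predicted, float)
    w = np.asarray(weights, float)
    return float(np.sum(w * np.abs(a - p) / a) / np.sum(w))

# Usage: hourly energy use (kWh), with assumed peak hours weighted 3x.
actual    = [10.0, 12.0, 30.0, 28.0]
predicted = [11.0, 12.5, 26.0, 27.0]
weights   = [1, 1, 3, 3]
print(scale_independent_error(actual, predicted),
      volatility(actual, predicted),
      cost_weighted_error(actual, predicted, weights))
```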