111 results for feature based cost


Relevance: 30.00%

Abstract:

A resource-interaction-based game theoretical model for military conflicts is presented in this paper. The model includes both the spatial decision capability of the adversaries (decisions regarding movement and subsequent distribution of resources) and their temporal decision capability (decisions regarding the level of allocation of resources for conflict with the adversary's resources). Attrition is currently determined by simple deterministic models. An additional feature of this model is the possibility of a given resource interacting with several resources of the adversary. The decisions of the adversaries are determined by solving for the Nash equilibrium strategies, given that the objectives of the adversaries may not be in direct conflict. Examples are given to show the applicability of these models and solution concepts.
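
The abstract does not give a solution procedure here; purely as a hypothetical illustration, the sketch below searches for mutual best responses (a pure-strategy Nash equilibrium) in a toy two-sided resource-allocation game. The budget, the discretized strategy set and both payoff functions are invented stand-ins, not the paper's model, and best-response iteration is not guaranteed to converge in general.

```python
import itertools
import numpy as np

# Toy setting: each side splits 10 units of a resource across 3 regions.
BUDGET, REGIONS = 10, 3
STRATS = [np.array(a) for a in itertools.product(range(BUDGET + 1), repeat=REGIONS)
          if sum(a) == BUDGET]

def payoff_a(a, b):
    # Side A values regions where it outnumbers B (a crude deterministic attrition stand-in).
    return float(np.sum(np.maximum(a - b, 0)))

def payoff_b(a, b):
    # Side B's objective is deliberately not the exact negation of A's (non-zero-sum game).
    return float(np.sum(np.maximum(b - 0.8 * a, 0)))

def nash_by_best_response(max_iters=50):
    a, b = STRATS[0], STRATS[-1]
    for _ in range(max_iters):
        a_new = max(STRATS, key=lambda s: payoff_a(s, b))       # A's best response to b
        b_new = max(STRATS, key=lambda s: payoff_b(a_new, s))   # B's best response to a_new
        if np.array_equal(a_new, a) and np.array_equal(b_new, b):
            break   # mutual best responses: a pure-strategy Nash equilibrium
        a, b = a_new, b_new
    return a, b

a_star, b_star = nash_by_best_response()
print("A allocation:", a_star, "B allocation:", b_star)
```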

Relevance: 30.00%

Abstract:

When hosting XML information on relational backends, a mapping has to be established between the schemas of the information source and the target storage repositories. A rich body of recent literature exists for mapping isolated components of XML Schema to their relational counterparts, especially with regard to table configurations. In this paper, we present the Elixir system for designing industrial-strength mappings for real-world applications. Specifically, it produces an information-preserving holistic mapping that transforms the complete XML world-view (XML schema with constraints, XML documents, XQuery queries, including triggers and views) into a full-scale relational mapping (table definitions, integrity constraints, indices, triggers and views) that is tuned to the application workload. A key design feature of Elixir is that it performs all its mapping-related optimizations in the XML source space, rather than in the relational target space. Further, unlike the XML mapping tools of commercial database systems, which rely heavily on user inputs, Elixir takes a principled cost-based approach to automatically find an efficient relational mapping. A prototype of Elixir is operational and we quantitatively demonstrate its functionality and efficacy on a variety of real-life XML schemas.
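
As a loose sketch of the cost-based selection idea, assuming a workload of (query, frequency) pairs and per-mapping cost estimators; the class and field names below are hypothetical and are not Elixir's actual interfaces.

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

@dataclass
class CandidateMapping:
    name: str                                   # e.g. "inline-all", "shared-subtrees" (hypothetical labels)
    table_ddl: List[str]                        # simplified relational table definitions
    est_query_cost: Callable[[str], float]      # estimated cost of one XQuery under this mapping

def pick_cheapest_mapping(candidates: List[CandidateMapping],
                          workload: List[Tuple[str, float]]) -> CandidateMapping:
    """workload: (xquery_text, frequency) pairs; returns the lowest estimated-cost candidate."""
    def total_cost(m: CandidateMapping) -> float:
        return sum(freq * m.est_query_cost(query) for query, freq in workload)
    return min(candidates, key=total_cost)
```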

Relevance: 30.00%

Abstract:

Solar cells on thin conformable substrates require conventional plastics such as PS and PMMA, which provide better mechanical and environmental stability at reduced cost. We can also tune charge transfer between PPV derivatives and fullerene derivatives via morphology control of the plastics in the solar cells. Our group has conducted morphology evolution studies of nano- and microscale light-emitting domains in poly(2-methoxy, 5-(2'-ethyl-hexyloxy)-p-phenylenevinylene) (MEH-PPV) and poly(methyl methacrylate) (PMMA) blends. Our current research has focused on tricomponent photoactive solar cells that comprise MEH-PPV, PMMA, and [6,6]-phenyl C61-butyric acid methyl ester (PCBM, Figure 1) in the photoactive layer. Morphology control of the photoactive materials and fine tuning of the photovoltaic properties of the solar cells are our primary interest. Similar work has been done by the Sariciftci research group. Additionally, a study on inter- and intramolecular photoinduced charge transfer using MEH-PPV derivatives with different conjugation lengths (Figure 1, n = 1 and 0.85) has been performed.

Relevance: 30.00%

Abstract:

Recent developments in our laboratory related to polymer-based light sensors are reviewed. The inherent processability of the active polymer medium is utilized in the implementation of different designs for opto-electronic applications. The utility of these devices as sensitive photodetectors, image sensors and position-sensitive detectors is demonstrated. The Schottky-type layer formation at interfaces of polymers such as polyalkylthiophenes and aluminum, accompanied by the enhanced photo-induced charge separation due to the high local electric field, is exploited in some of these device structures. The sensitivity of polymer-based field effect transistors to light also provides a convenient lateral geometry for efficient optical coupling and control of the transistor state. The range of these polymer detectors, with the option of operating in diode or transistor mode, should be an attractive feature for many potential applications.

Relevance: 30.00%

Abstract:

The highest levels of security can be achieved through the use of more than one type of cryptographic algorithm for each security function. In this paper, the REDEFINE polymorphic architecture is presented as an architecture framework that can optimally support a varied set of crypto algorithms without losing high performance. The presented solution is capable of accelerating the advanced encryption standard (AES) and elliptic curve cryptography (ECC) cryptographic protocols, while still supporting different flavors of these algorithms as well as different underlying finite field sizes. The compelling feature of this cryptosystem is the ability to provide acceleration support for new field sizes as well as new (possibly proprietary) cryptographic algorithms decided upon after the cryptosystem is deployed.
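
REDEFINE is a hardware fabric, so no software snippet reproduces it; the sketch below only illustrates, in Python, what "supporting different underlying finite field sizes" means for ECC-style arithmetic: the same prime-field routines are parameterized by the modulus p. The example modulus is the NIST P-256 prime, used here purely as a familiar field size, not as something tied to REDEFINE.

```python
class PrimeField:
    """Arithmetic modulo a prime p; the field size is just a constructor parameter."""
    def __init__(self, p: int):
        self.p = p

    def add(self, a, b):
        return (a + b) % self.p

    def mul(self, a, b):
        return (a * b) % self.p

    def inv(self, a):
        # Fermat's little theorem: a^(p-2) mod p, valid for prime p and a != 0.
        return pow(a, self.p - 2, self.p)

# The same routines serve a tiny toy field or the NIST P-256 prime alike.
F = PrimeField(2**256 - 2**224 + 2**192 + 2**96 - 1)
assert F.mul(F.inv(12345), 12345) == 1
```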

Relevance: 30.00%

Abstract:

Background & objectives: There is a need to develop an affordable and reliable tool for hearing screening of neonates in resource-constrained, medically underserved areas of developing nations. This study evaluates a strategy of health-worker-based screening of neonates using a low-cost mechanical calibrated noisemaker, followed up with parental monitoring of age-appropriate auditory milestones, for detecting severe-profound hearing impairment in infants by 6 months of age. Methods: A trained health worker, under the supervision of a qualified audiologist, screened 425 neonates, of whom 20 had confirmed severe-profound hearing impairment. Mechanical calibrated noisemakers of 50, 60, 70 and 80 dB (A) were used to elicit the behavioural responses. The parents of screened neonates were instructed to monitor the normal language and auditory milestones till 6 months of age. This strategy was validated against a reference standard consisting of a battery of tests, namely auditory brain stem response (ABR), otoacoustic emissions (OAE) and behavioural assessment at 2 years of age. Bayesian prevalence-weighted measures of screening were calculated. Results: The sensitivity and specificity were high, with the fewest false positive referrals for the 70 and 80 dB (A) noisemakers. All the noisemakers had 100 per cent negative predictive value. The 70 and 80 dB (A) noisemakers had high positive likelihood ratios of 19 and 34, respectively. The differences between pre- and post-test positive probabilities were 43 and 58 for the 70 and 80 dB (A) noisemakers, respectively. Interpretation & conclusions: In a controlled setting, health workers with primary education can be trained to use a mechanical calibrated noisemaker made of locally available material to reliably screen for severe-profound hearing loss in neonates. The monitoring of auditory responses could be done by informed parents. Multi-centre field trials of this strategy need to be carried out to examine the feasibility of community health care workers using it in resource-constrained settings of developing nations to implement an effective national neonatal hearing screening programme.
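
The screening measures mentioned above (sensitivity, specificity, predictive values, positive likelihood ratio, and the pre-/post-test probability shift) follow standard definitions; a minimal sketch is given below. The 2x2 counts in the example call are hypothetical and are not the study's data, although the pre-test probability uses the 20-of-425 prevalence quoted in the abstract.

```python
def screening_measures(tp, fp, fn, tn):
    """Standard measures from a 2x2 screening table."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    lr_pos = sens / (1 - spec)                 # positive likelihood ratio
    return sens, spec, ppv, npv, lr_pos

def post_test_probability(pre_test_prob, lr_pos):
    """Bayes via odds: post-test odds = pre-test odds x LR+."""
    pre_odds = pre_test_prob / (1 - pre_test_prob)
    post_odds = pre_odds * lr_pos
    return post_odds / (1 + post_odds)

# Hypothetical counts (NOT the study's data), with the abstract's 20/425 prevalence as pre-test probability.
sens, spec, ppv, npv, lr = screening_measures(tp=19, fp=12, fn=1, tn=393)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} NPV={npv:.2f} LR+={lr:.1f}")
print(f"post-test probability given a refer result: {post_test_probability(20 / 425, lr):.2f}")
```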

Relevance: 30.00%

Abstract:

This paper presents a decentralized/peer-to-peer architecture-based parallel version of the vector evaluated particle swarm optimization (VEPSO) algorithm for multi-objective design optimization of laminated composite plates using the message passing interface (MPI). The design optimization of laminated composite plates, being a combinatorially explosive constrained non-linear optimization problem (CNOP) with many design variables and a vast solution space, warrants the use of non-parametric and heuristic optimization algorithms like PSO. Optimization requires minimizing both the weight and the cost of these composite plates simultaneously, which renders the problem multi-objective. Hence VEPSO, a multi-objective variant of the PSO algorithm, is used. Despite the use of such a heuristic, the application problem, being computationally intensive, suffers from long execution times under sequential computation. Hence, a parallel version of the PSO algorithm for the problem has been developed to run on several nodes of an IBM P720 cluster. The proposed parallel algorithm, using MPI's collective communication directives, establishes a peer-to-peer relationship between the constituent parallel processes, deviating from the more common master-slave approach, and achieves a reduction of computation time by a factor of up to 10. Finally, we show the effectiveness of the proposed parallel algorithm by comparing it with a serial implementation of VEPSO and a parallel implementation of the vector evaluated genetic algorithm (VEGA) for the same design problem.
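
A rough sketch of the peer-to-peer exchange idea using mpi4py's collective allgather is given below. The PSO update, the objective function and all parameter values are generic placeholders rather than the paper's laminate design problem; the point is only that every process exchanges its best solution with all peers through a collective call, with no master process.

```python
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

dim, n_particles = 8, 20
rng = np.random.default_rng(rank)
pos = rng.uniform(-1, 1, (n_particles, dim))
vel = np.zeros_like(pos)
pbest = pos.copy()

def objective(x):
    # Placeholder objective standing in for the weight/cost of one laminate design.
    return float(np.sum(x ** 2))

for it in range(100):
    # Each process (swarm) works on one objective; swarms exchange their best
    # solutions as equal peers via a collective, with no master process.
    local_best = min(pbest, key=objective)
    all_bests = comm.allgather(local_best)           # peer-to-peer exchange of bests
    gbest = all_bests[(rank + 1) % size]             # VEPSO flavor: guided by another swarm's best
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos += vel
    better = np.apply_along_axis(objective, 1, pos) < np.apply_along_axis(objective, 1, pbest)
    pbest[better] = pos[better]
```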

Relevance: 30.00%

Abstract:

The effectiveness of the last-level shared cache is crucial to the performance of a multi-core system. In this paper, we observe and make use of the DelinquentPC - Next-Use characteristic to improve shared cache performance. We propose a new PC-centric cache organization, NUcache, for the shared last-level cache of multi-cores. NUcache logically partitions the associative ways of a cache set into MainWays and DeliWays. While all lines have access to the MainWays, only lines brought in by a subset of delinquent PCs, selected by a PC selection mechanism, are allowed to enter the DeliWays. The PC selection mechanism is an intelligent cost-benefit analysis based algorithm that utilizes Next-Use information to select the set of PCs that maximizes the hits experienced in the DeliWays. Performance evaluation reveals that NUcache improves performance over a baseline design by 9.6%, 30% and 33%, respectively, for dual-, quad- and eight-core workloads comprised of SPEC benchmarks. We also show that NUcache is more effective than other well-known cache-partitioning algorithms.
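
Purely as a toy software analogy of the MainWays/DeliWays split (the way counts, the eviction policy and the insertion rule below are simplified stand-ins, not the paper's exact mechanism): fills from the selected delinquent PCs are placed in the DeliWays, while all other fills use the MainWays.

```python
from collections import OrderedDict

class NUCacheSet:
    """One set of a toy cache partitioned into MainWays and DeliWays (assumed way counts)."""
    def __init__(self, main_ways=12, deli_ways=4, delinquent_pcs=frozenset()):
        self.main = OrderedDict()          # LRU order: oldest entry first
        self.deli = OrderedDict()
        self.main_ways, self.deli_ways = main_ways, deli_ways
        self.delinquent_pcs = delinquent_pcs

    def access(self, tag, pc):
        for part in (self.main, self.deli):
            if tag in part:
                part.move_to_end(tag)      # hit: refresh LRU position
                return True
        # Miss: lines from the selected delinquent PCs go to the DeliWays (kept longer),
        # all other lines go to the MainWays.
        part, ways = ((self.deli, self.deli_ways) if pc in self.delinquent_pcs
                      else (self.main, self.main_ways))
        if len(part) >= ways:
            part.popitem(last=False)       # evict the LRU line of that partition
        part[tag] = pc
        return False
```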

Relevance: 30.00%

Abstract:

In this paper, we develop a game theoretic approach for clustering features in a learning problem. Feature clustering can serve as an important preprocessing step in many problems such as feature selection and dimensionality reduction. In this approach, we view features as rational players of a coalitional game in which they form coalitions (or clusters) among themselves in order to maximize their individual payoffs. We show how the Nash Stable Partition (NSP), a well-known concept in coalitional game theory, provides a natural way of clustering features. Through this approach, one can obtain desirable properties of the clusters by choosing appropriate payoff functions. For a small number of features, the NSP-based clustering can be found by solving an integer linear program (ILP). However, for a large number of features, the ILP-based approach does not scale well and hence we propose a hierarchical approach. Interestingly, a key result that we prove on the equivalence between a k-size NSP of a coalitional game and a minimum k-cut of an appropriately constructed graph comes in handy for large-scale problems. In this paper, we use the feature selection problem (in a classification setting) as a running example to illustrate our approach. We conduct experiments to illustrate the efficacy of our approach.
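
To make the graph connection concrete, here is an illustrative sketch of the k = 2 case: features become graph nodes, edges are weighted by a pairwise dependence measure, and a global minimum cut splits them into two clusters. The data and the use of absolute correlation as the edge weight are invented for illustration; the paper's payoff design and its NSP / minimum k-cut construction are richer than this.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 6))
X[:, 3] = X[:, 0] + 0.1 * rng.normal(size=200)   # make features 0 and 3 strongly related

corr = np.abs(np.corrcoef(X, rowvar=False))      # stand-in for inter-feature dependence
G = nx.Graph()
n_features = corr.shape[0]
for i in range(n_features):
    for j in range(i + 1, n_features):
        G.add_edge(i, j, weight=float(corr[i, j]))

# Global minimum cut of the feature graph = a two-way split with minimal cross-cluster dependence.
cut_value, (cluster_a, cluster_b) = nx.stoer_wagner(G)
print("feature clusters:", sorted(cluster_a), sorted(cluster_b))
```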

Relevance: 30.00%

Abstract:

We address the classical problem of delta feature computation, and interpret the operation involved in terms of Savitzky-Golay (SG) filtering. Features such as the mel-frequency cepstral coefficients (MFCCs), obtained from short-time spectra of the speech signal, are commonly used in speech recognition tasks. In order to incorporate the dynamics of speech, auxiliary delta and delta-delta features, which are computed as temporal derivatives of the original features, are used. Typically, the delta features are computed in a smooth fashion using local least-squares (LS) polynomial fitting on each feature vector component trajectory. In the light of the original work of Savitzky and Golay, and a recent article by Schafer in the IEEE Signal Processing Magazine, we interpret the dynamic feature vector computation for arbitrary derivative orders as SG filtering with a fixed impulse response. This filtering equivalence brings significantly lower latency with no loss in accuracy, as validated by results on a TIMIT phoneme recognition task. The SG filters involved in dynamic parameter computation can be viewed as modulation filters, as proposed by Hermansky.
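
A short sketch of the equivalence described above: the classic regression-style delta computation is reproduced by a Savitzky-Golay filter with a degree-1 fit and first-derivative output. The MFCC matrix is random, only to exercise the code, and the window length of 5 frames corresponds to a delta context of +/-2 frames.

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(0)
mfcc = rng.normal(size=(100, 13))            # frames x cepstral coefficients (random stand-in)

def delta_regression(feats, theta=2):
    """Classic delta computation: weighted differences over +/- theta frames."""
    denom = 2 * sum(t * t for t in range(1, theta + 1))
    padded = np.pad(feats, ((theta, theta), (0, 0)), mode='edge')
    out = np.zeros_like(feats)
    for t in range(1, theta + 1):
        out += t * (padded[theta + t:theta + t + len(feats)]
                    - padded[theta - t:theta - t + len(feats)])
    return out / denom

# The same quantity as a Savitzky-Golay filter: window 2*theta+1, degree-1 LS fit, first derivative.
delta_sg = savgol_filter(mfcc, window_length=5, polyorder=1, deriv=1, axis=0, mode='nearest')
print("max |regression delta - SG delta|:", np.abs(delta_regression(mfcc) - delta_sg).max())
```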

Relevance: 30.00%

Abstract:

This paper presents a multilevel inverter topology suitable for the generation of dodecagonal space vectors instead of the hexagonal space vectors of conventional schemes. This feature eliminates all the 6n ± 1 (n odd) harmonics from the phase voltages and currents over the entire modulation range, with an increase in the linear modulation range. The topology is realized by flying-capacitor-based three-level inverters with asymmetric dc links feeding an open-end winding induction motor from both ends. The flying capacitor voltages are tightly controlled throughout the modulation range using redundant switching states, for any load power factor. A simple and fast carrier-based space-vector pulsewidth modulation (PWM) scheme is also proposed for the topology, which utilizes only the sampled amplitudes of the reference wave for the PWM timing computation.
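
As a purely geometric illustration (not the paper's PWM algorithm), the sketch below generates the twelve dodecagonal space-vector tips, spaced 30 degrees apart, and locates the 30-degree sector containing a sampled reference vector; the radius and the reference sample are arbitrary.

```python
import numpy as np

def dodecagonal_vectors(radius=1.0):
    """The twelve space-vector tips of a dodecagonal structure, 30 degrees apart."""
    angles = np.deg2rad(30.0 * np.arange(12))
    return radius * np.exp(1j * angles)          # complex space-vector tips

def sector_of(reference):
    """Index k of the 30-degree sector containing the reference vector."""
    angle = np.angle(reference) % (2 * np.pi)
    return int(angle // np.deg2rad(30.0))

v_ref = 0.9 * np.exp(1j * np.deg2rad(47.0))      # arbitrary sampled reference vector
print("sector:", sector_of(v_ref), "first two vertices:", dodecagonal_vectors()[:2])
```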

Relevance: 30.00%

Abstract:

The goal of speech enhancement algorithms is to provide an estimate of clean speech starting from noisy observations. The often-employed cost function is the mean square error (MSE). However, the MSE can never be computed in practice. Therefore, it becomes necessary to find practical alternatives to the MSE. In image denoising problems, the cost function (also referred to as risk) is often replaced by an unbiased estimator. Motivated by this approach, we reformulate the problem of speech enhancement from the perspective of risk minimization. Some recent contributions in risk estimation have employed Stein's unbiased risk estimator (SURE) together with a parametric denoising function, which is a linear expansion of thresholds/bases (LET). We show that the first-order case of SURE-LET results in a Wiener-filter-type solution if the denoising function is made frequency-dependent. We also provide enhancement results obtained with both techniques and characterize the improvement by means of local as well as global SNR calculations.
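
A minimal sketch of a frequency-dependent Wiener-type gain, which is the flavor of solution the first-order frequency-dependent case reduces to, is given below. The signals are synthetic and the noise power spectrum is assumed known; this is not the paper's SURE-LET estimator itself.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, n = 8000, 4096
t = np.arange(n) / fs
clean = np.sin(2 * np.pi * 440 * t)                       # synthetic stand-in for clean speech
noisy = clean + 0.3 * rng.normal(size=n)

Y = np.fft.rfft(noisy)
noise_psd = np.full(Y.shape, (0.3 ** 2) * n)              # assumed-known white-noise power per bin
noisy_psd = np.abs(Y) ** 2
gain = np.maximum(1.0 - noise_psd / np.maximum(noisy_psd, 1e-12), 0.0)   # frequency-dependent gain
enhanced = np.fft.irfft(gain * Y, n=n)

def snr_db(ref, est):
    return 10 * np.log10(np.sum(ref ** 2) / np.sum((ref - est) ** 2))

print(f"global SNR before: {snr_db(clean, noisy):.1f} dB, after: {snr_db(clean, enhanced):.1f} dB")
```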

Relevance: 30.00%

Abstract:

In this paper, we present a methodology for identifying the best features from a large feature space. In a high-dimensional feature space, nearest neighbor search becomes almost meaningless: both the quality and the performance of the search degrade. Many data mining algorithms rely on nearest neighbor search, so instead of searching over all the features we need to select the relevant ones. We propose feature selection using Non-negative Matrix Factorization (NMF) and apply it to nearest neighbor search. A recent clustering algorithm based on Locally Consistent Concept Factorization (LCCF) achieves better document clustering quality by using the local geometrical and discriminating structure of the data. Using our feature selection method, we show a further improvement in clustering performance.
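
One hedged way to turn NMF into a feature selector is sketched below: factorize the non-negative document-term matrix and score each feature (term) by the largest weight it receives in the learned basis, keeping the top-scoring ones. The toy corpus and the particular scoring rule are illustrative assumptions; the paper's exact criterion may differ.

```python
import numpy as np
from sklearn.decomposition import NMF
from sklearn.feature_extraction.text import CountVectorizer

docs = ["graph mining on large graphs", "mining frequent patterns",
        "neural networks for vision", "deep neural models for images"]
X = CountVectorizer().fit_transform(docs)            # non-negative document-term matrix

model = NMF(n_components=2, init="nndsvda", random_state=0, max_iter=500)
W = model.fit_transform(X)                           # documents x topics
H = model.components_                                # topics x terms

scores = H.max(axis=0)                               # a feature's strongest basis weight
top_k = np.argsort(scores)[::-1][:5]                 # keep the 5 highest-scoring terms
print("selected feature indices:", top_k)
```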

Relevance: 30.00%

Abstract:

This paper develops a GIS (geographical information system)-based data mining approach for optimally selecting the locations and determining the installed capacities of distributed biomass power generation systems, in the context of decentralized energy planning for rural regions. The optimal locations within a cluster of villages are obtained by matching the installed capacity to the demand for power, while minimizing the cost of transporting biomass from dispersed sources to the power generation system and the cost of distributing electricity from the power generation system to the demand centers or villages. The methodology was validated by using it to develop an optimal plan for implementing distributed biomass-based power systems to meet the rural electricity needs of Tumkur district in India, which consists of 2700 villages. The approach uses a k-medoid clustering algorithm to divide the total region into clusters of villages and to locate the biomass power generation systems at the medoids. The optimal value of k is determined iteratively by running the algorithm over the entire search space for different values of k, subject to demand-supply matching constraints, and is chosen such that it minimizes the total cost of system installation, biomass transportation, and transmission and distribution. A smaller region, consisting of 293 villages, was selected to study the sensitivity of the results to varying demand and supply parameters. The results of the clustering are represented on a GIS map of the region.
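
A rough k-medoids sketch on synthetic village coordinates is given below, with k chosen by trading off a transport-distance proxy against a per-plant installation cost. The alternating update, the random data and the cost weights are simplifications; the paper's formulation additionally enforces demand-supply matching and separates transmission and distribution costs.

```python
import numpy as np

def kmedoids(points, k, iters=100, seed=0):
    """Plain alternating k-medoids on a pairwise distance matrix."""
    rng = np.random.default_rng(seed)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    medoids = rng.choice(len(points), size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(d[:, medoids], axis=1)                 # assign each village to nearest medoid
        new_medoids = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if len(members):
                # new medoid = member minimizing total distance to its cluster (transport proxy)
                new_medoids[c] = members[np.argmin(d[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    labels = np.argmin(d[:, medoids], axis=1)
    transport_cost = d[np.arange(len(points)), medoids[labels]].sum()
    return medoids, labels, transport_cost

villages = np.random.default_rng(1).uniform(0, 100, size=(300, 2))   # synthetic village coordinates (km)
results = {k: kmedoids(villages, k) for k in range(2, 10)}
# Toy trade-off: transport-distance proxy plus an assumed installation cost of 200 units per plant.
best_k = min(results, key=lambda k: results[k][2] + 200.0 * k)
print("chosen number of plants:", best_k, "at medoid villages:", results[best_k][0])
```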

Relevance: 30.00%

Abstract:

Outlier detection in high-dimensional categorical data has been a problem of much interest due to the extensive use of qualitative features for describing data across various application areas. Though various established methods exist for dealing with the dimensionality aspect through feature selection on numerical data, the categorical domain is still being actively explored. As outlier detection is generally considered an unsupervised learning problem, owing to the lack of knowledge about the nature of the various types of outliers, the related feature selection task also needs to be handled in a similar manner. This motivates the need for an unsupervised feature selection algorithm for efficient detection of outliers in categorical data. Addressing this aspect, we propose a novel feature selection algorithm based on mutual information and entropy computation. The redundancy among the features is characterized using the mutual information measure, in order to identify a suitable feature subset with low redundancy. The performance of the proposed algorithm, in comparison with information-gain-based feature selection, shows its effectiveness for outlier detection. The efficacy of the proposed algorithm is demonstrated on various high-dimensional benchmark data sets employing two existing outlier detection methods.
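
As an illustrative sketch of redundancy-driven selection for categorical features (the greedy criterion, the synthetic data and the seeding choice are assumptions, and the paper's algorithm also uses entropy to weigh features): repeatedly keep the candidate whose maximum mutual information with the already-selected features is lowest.

```python
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(0)
X = rng.integers(0, 4, size=(500, 8))        # synthetic categorical codes (500 objects, 8 features)
X[:, 5] = X[:, 1]                            # plant one fully redundant feature

def select_low_redundancy(X, n_select):
    remaining = list(range(X.shape[1]))
    selected = [remaining.pop(0)]            # seed with the first feature (arbitrary choice)
    while len(selected) < n_select and remaining:
        # Redundancy of a candidate = its highest mutual information with any already-selected feature.
        redundancy = {j: max(mutual_info_score(X[:, j], X[:, s]) for s in selected)
                      for j in remaining}
        best = min(redundancy, key=redundancy.get)
        selected.append(best)
        remaining.remove(best)
    return selected

print("selected feature indices:", select_low_redundancy(X, n_select=4))
```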