18 results for IS-enabled Innovation Framework

at Indian Institute of Science - Bangalore - India


Relevance:

100.00%

Publisher:

Abstract:

We address the problem of allocating a single divisible good to a number of agents. The agents have concave valuation functions parameterized by a scalar type. The agents report only the type. The goal is to find allocatively efficient, strategy-proof, nearly budget-balanced mechanisms within the Groves class. Near budget balance is attained by returning as much of the received payments as rebates to agents. Two performance criteria are of interest: the maximum ratio of budget surplus to efficient surplus, and the expected budget surplus, within the class of linear rebate functions. The goal is to minimize them. Assuming that the valuation functions are known, we show that both problems reduce to convex optimization problems, where the convex constraint sets are characterized by a continuum of half-plane constraints parameterized by the vector of reported types. We then propose a randomized relaxation of these problems by sampling constraints. The relaxed problem is a linear programming problem (LP). We then identify the number of samples needed for "near-feasibility" of the relaxed constraint set. Under some conditions on the valuation function, we show that the value of the approximate LP is close to the optimal value. Simulation results show significant improvements of our proposed method over the Vickrey-Clarke-Groves (VCG) mechanism without rebates. In the special case of indivisible goods, the mechanisms in this paper fall back to those proposed by Moulin, by Guo and Conitzer, and by Gujar and Narahari, without any need for randomization. Extensions of the proposed mechanisms to situations where the valuation functions are not known to the central planner are also discussed.

Note to Practitioners: Our results will be useful in all resource allocation problems that involve gathering of information privately held by strategic users, where the utilities are any concave function of the allocations, and where the resource planner is not interested in maximizing revenue, but in efficient sharing of the resource. Such situations arise quite often in fair sharing of internet resources, fair sharing of funds across departments within the same parent organization, auctioning of public goods, etc. We study methods to achieve near budget balance by first collecting payments according to the celebrated VCG mechanism, and then returning as much of the collected money as rebates. Our focus on linear rebate functions allows for easy implementation. The resulting convex optimization problem is solved via relaxation to a randomized linear programming problem, for which several efficient solvers exist. This relaxation is enabled by constraint sampling. Keeping practitioners in mind, we identify the number of samples that assures a desired level of "near-feasibility" with the desired confidence level. Our methodology will occasionally require a subsidy from outside the system. We demonstrate via simulation, however, that if the mechanism is repeated several times over independent instances, then past surplus can support the subsidy requirements. We also extend our results to situations where the strategic users' utility functions are not known to the allocating entity, a common situation in the context of internet users and other problems.
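
The constraint-sampling step can be illustrated in isolation. The sketch below is a minimal stand-in, not the paper's rebate mechanism: the functions a_of and b_of are hypothetical placeholders for the half-plane constraints that sampled type vectors induce on the rebate coefficients, and the objective is a generic linear surrogate for the surplus criteria.

```python
# A minimal sketch of constraint sampling for a semi-infinite LP
# (illustration only, not the paper's full rebate mechanism): the
# continuum of half-plane constraints indexed by type vectors theta is
# replaced by finitely many sampled constraints, giving an ordinary LP.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n_vars = 5         # rebate coefficients (hypothetical dimension)
n_samples = 2000   # sampled type vectors; the paper bounds this number

# Hypothetical placeholders for the constraint a(theta) @ x <= b(theta)
# induced by a reported-type vector theta.
def a_of(theta):
    return theta * theta.sum()

def b_of(theta):
    return theta.sum() ** 2

thetas = rng.uniform(0.0, 1.0, size=(n_samples, n_vars))
A_ub = np.array([a_of(t) for t in thetas])
b_ub = np.array([b_of(t) for t in thetas])

c = np.ones(n_vars)  # generic linear surrogate for the surplus objective
res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(-1.0, 1.0)] * n_vars)
print(res.status, res.x)
```

The sampled LP is "nearly feasible" for the original semi-infinite problem with high probability once enough constraints are drawn, which is exactly the guarantee the abstract quantifies.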

Relevance:

100.00%

Publisher:

Abstract:

This paper investigates the use of adaptive group testing to find a spectrum hole of a specified bandwidth in a given wideband of interest. We propose a group testing-based spectrum hole search algorithm that exploits sparsity in the primary spectral occupancy by testing a group of adjacent subbands in a single test. This is enabled by a simple and easily implementable sub-Nyquist sampling scheme for signal acquisition by the cognitive radios (CRs). The sampling scheme deliberately introduces aliasing during signal acquisition, resulting in a signal that is the sum of signals from adjacent subbands. Energy-based hypothesis tests are used to provide an occupancy decision over the group of subbands, and this forms the basis of the proposed algorithm to find contiguous spectrum holes of a specified bandwidth. We extend this framework to a multistage sensing algorithm that can be employed in a variety of spectrum sensing scenarios, including noncontiguous spectrum hole search. Furthermore, we provide the analytical means to optimize the group tests with respect to the detection thresholds, number of samples, group size, and number of stages to minimize the detection delay under a given error probability constraint. Our analysis allows one to identify the sparsity and SNR regimes where group testing can lead to significantly lower detection delays compared with a conventional bin-by-bin energy detection scheme; the latter is, in fact, a special case of the group test when the group size is set to 1 bin. We validate our analytical results via Monte Carlo simulations.
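
As a toy illustration of the core idea (not the paper's optimized multistage algorithm), the sketch below declares a group of adjacent subbands vacant when a summed energy statistic stays below a threshold; the group size, sample count, SNR, occupancy probability, and threshold are all assumed values.

```python
# Toy simulation of group testing for spectrum holes: one group test
# measures the summed energy of `group` adjacent subbands (mimicking
# the aliased sub-Nyquist samples) and declares the whole group vacant
# if the energy statistic stays below a threshold.
import numpy as np

rng = np.random.default_rng(1)

n_bins, group, n_samp, snr = 64, 4, 50, 2.0
occupied = rng.random(n_bins) < 0.1          # sparse primary occupancy

def group_energy(start):
    """Energy detector over bins [start, start + group)."""
    sig = np.zeros(n_samp)
    for b in range(start, start + group):
        if occupied[b]:
            sig += np.sqrt(snr) * rng.standard_normal(n_samp)
    noise = rng.standard_normal(n_samp)
    return np.mean((sig + noise) ** 2)

threshold = 1.0 + 3.0 / np.sqrt(n_samp)      # crude choice (assumption)
tests, hole = 0, None
for start in range(0, n_bins, group):
    tests += 1
    if group_energy(start) < threshold:      # whole group looks vacant
        hole = (start, start + group)
        break
print(f"hole={hole}, group tests used={tests}")
```

Setting group = 1 recovers the conventional bin-by-bin energy detector, which is why the latter is a special case of the group test.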

Relevance:

100.00%

Publisher:

Abstract:

Clock synchronization is highly desirable in distributed systems, including many applications in the Internet of Things and Humans (IoTH). It improves the efficiency, modularity, and scalability of the system, and optimizes the use of event triggers. For IoTH, Bluetooth Low Energy (BLE) - a subset of the recent Bluetooth v4.0 stack - provides a low-power and loosely coupled mechanism for sensor data collection with ubiquitous units (e.g., smartphones and tablets) carried by humans. This fundamental design paradigm of BLE is enabled by a range of broadcast advertising modes. While its operational benefits are numerous, the lack of a common time reference in the broadcast mode of BLE has been a fundamental limitation. This article presents and describes CheepSync, a time synchronization service for BLE advertisers, especially tailored for applications requiring high time precision on resource-constrained BLE platforms. Designed on top of the existing Bluetooth v4.0 standard, the CheepSync framework utilizes low-level time-stamping and comprehensive error compensation mechanisms for overcoming uncertainties in message transmission, clock drift, and other system-specific constraints. CheepSync was implemented on custom-designed nRF24Cheep beacon platforms (as broadcasters) and commercial off-the-shelf Android smartphones (as passive listeners). We demonstrate the efficacy of CheepSync by numerous empirical evaluations in a variety of experimental setups, and show that its average (single-hop) time synchronization accuracy is in the 10 μs range.
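
One generic ingredient of such services can be sketched as follows. This is not the CheepSync protocol itself: it only shows how a passive listener could estimate relative clock offset and drift by line-fitting (sender timestamp, receiver timestamp) pairs; all numbers are assumed.

```python
# Generic broadcast-sync ingredient (not the actual CheepSync protocol):
# fit a line to (sender timestamp, receiver timestamp) pairs to estimate
# the relative clock offset and drift between two unsynchronized clocks.
import numpy as np

rng = np.random.default_rng(2)

true_offset = 0.125      # seconds (hypothetical)
true_drift = 40e-6       # 40 ppm (hypothetical)

t_tx = np.sort(rng.uniform(0, 100, size=30))     # advertiser timestamps
jitter = rng.normal(0, 10e-6, size=t_tx.size)    # 10 us receive jitter
t_rx = true_offset + (1 + true_drift) * t_tx + jitter

# Least-squares line fit: t_rx ~= (1 + drift) * t_tx + offset
slope, offset = np.polyfit(t_tx, t_rx, 1)
drift = slope - 1.0
print(f"offset ~ {offset:.6f} s, drift ~ {drift * 1e6:.1f} ppm")

# Map a later local timestamp back to the advertiser's timebase.
t_local = 123.456
print((t_local - offset) / (1 + drift))
```

The hard part CheepSync addresses is obtaining clean timestamp pairs in the first place, via low-level time-stamping and compensation for transmission uncertainties.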

Relevance:

100.00%

Publisher:

Abstract:

Two new neutral copper-azido polymers, [Cu3(N3)6(tmen)2]n (1) and [Cu6(N3)12(deen)2]n (2) [tmen = N,N,N',N'-tetramethylethylenediamine and deen = N,N-diethylethylenediamine], have been synthesized by using lower molar equivalents of the chelating diamine ligands with Cu(NO3)2·3H2O and an excess of NaN3. The single-crystal X-ray structure shows that, in the basic unit of the 1D complex 1, the three Cu(II) ions are linked by double end-on (EO) azido bridges with Cu-N(EO)-Cu angles on both sides of the magnetic exchange critical angle of 108°. Complex 2 is a 3D framework built from a basic Cu6 cluster. Cryomagnetic susceptibility measurements over a wide range of temperature exhibit dominant ferromagnetic behavior in both complexes. Density functional theory calculations (B3LYP functional) have been performed on the trinuclear unit to provide a qualitative theoretical interpretation of the overall ferromagnetic behavior shown by complex 1.

Relevance:

100.00%

Publisher:

Abstract:

Image fusion is a formal framework which is expressed as means and tools for the alliance of multisensor, multitemporal, and multiresolution data. Multisource data vary in spectral, spatial, and temporal resolutions, necessitating advanced analytical or numerical techniques for enhanced interpretation capabilities. This paper reviews seven pixel-based image fusion techniques: intensity-hue-saturation, Brovey, high-pass filter (HPF), high-pass modulation (HPM), principal component analysis, Fourier transform, and correspondence analysis. Validation of these techniques on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution) reveals that the HPF and HPM methods synthesize the images closest to those the corresponding multisensors would observe at the high-resolution level.
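
The HPF idea admits a very small sketch: add the high-frequency residual of the panchromatic image to an upsampled multispectral band. The box blur, the 4x scale, and the random stand-in arrays below are assumptions, not the paper's exact processing chain.

```python
# A minimal sketch of high-pass-filter (HPF) fusion: inject the
# high-frequency detail of the panchromatic band into an upsampled
# multispectral band.
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def hpf_fuse(ms_band, pan, scale=4, kernel=5):
    """ms_band: low-res multispectral band; pan: high-res panchromatic."""
    ms_up = zoom(ms_band, scale, order=1)            # upsample to pan grid
    detail = pan - uniform_filter(pan, size=kernel)  # high-pass of pan
    return ms_up + detail                            # inject spatial detail

pan = np.random.rand(256, 256)    # stand-in for the 1 m panchromatic band
ms = np.random.rand(64, 64)       # stand-in for one 4 m multispectral band
print(hpf_fuse(ms, pan).shape)    # (256, 256)
```

HPM differs in that the detail is injected multiplicatively, modulating the upsampled band by the ratio of the panchromatic image to its low-pass version.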

Relevance:

100.00%

Publisher:

Abstract:

A low-correlation interleaved QAM sequence family is presented here. In a CDMA setting, these sequences have the ability to transport a large amount of data as well as enable variable-rate signaling on the reverse link. The new interleaved sequence family INQ has period N and normalized maximum correlation parameter θ̄_max bounded above approximately by a√N, where a ranges from 1.17 in the 16-QAM case to 1.99 for large M²-QAM, where M = 2^m, m ≥ 2. Each user is enabled to transfer m + 1 bits of data per period of the spreading sequence. These constructions have the lowest known value of maximum correlation of any sequence family with the same alphabet.

Relevance:

100.00%

Publisher:

Abstract:

Frequent episode discovery is a popular framework for mining data available as a long sequence of events. An episode is essentially a short ordered sequence of event types, and the frequency of an episode is some suitable measure of how often the episode occurs in the data sequence. Recently, we proposed a new frequency measure for episodes based on the notion of non-overlapped occurrences of episodes in the event sequence, and showed that such a definition, in addition to yielding computationally efficient algorithms, has some important theoretical properties in connecting frequent episode discovery with hidden Markov model (HMM) learning. This paper presents some new algorithms for frequent episode discovery under this non-overlapped occurrences-based frequency definition. The algorithms presented here are better (by a factor of N, where N denotes the size of episodes being discovered) in terms of both time and space complexities when compared to existing methods for frequent episode discovery. We show through simulation experiments that our algorithms are very efficient. The new algorithms presented here have arguably the least possible orders of space and time complexities for the task of frequent episode discovery.
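
Counting non-overlapped occurrences of a serial episode is simple to sketch: track which event type the episode awaits next and restart after each completed occurrence, so no event is reused. A minimal, hedged illustration (the paper's algorithms track whole candidate sets at once):

```python
# Count non-overlapped occurrences of a serial episode (e.g. A -> B -> C)
# in one left-to-right pass: advance through the episode as matching
# events arrive, and restart after each complete occurrence so that no
# two counted occurrences share an event.
def count_non_overlapped(sequence, episode):
    """sequence: iterable of event types; episode: tuple of event types."""
    count, i = 0, 0                 # i indexes the awaited episode node
    for event in sequence:
        if event == episode[i]:
            i += 1
            if i == len(episode):   # one full occurrence completed
                count += 1
                i = 0               # restart: no event reuse
    return count

seq = list("XABXCABCXAABBCC")
print(count_non_overlapped(seq, ("A", "B", "C")))  # -> 3
```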

Relevance:

100.00%

Publisher:

Abstract:

The frequent episode discovery framework is a popular framework in temporal data mining with many applications. Over the years, many different notions of frequencies of episodes have been proposed, along with different algorithms for episode discovery. In this paper, we present a unified view of all the apriori-based discovery methods for serial episodes under these different notions of frequencies. Specifically, we present a unified view of the various frequency counting algorithms. We propose a generic counting algorithm such that all current algorithms are special cases of it. This unified view allows one to gain insights into different frequencies, and we present quantitative relationships among different frequencies. Our unified view also helps in obtaining correctness proofs for various counting algorithms, as we show here. It also aids in understanding and obtaining the anti-monotonicity properties satisfied by the various frequencies, the properties exploited by the candidate generation step of any apriori-based method. We also point out how our unified view of counting helps to consider generalization of the algorithm to count episodes with general partial orders.

Relevance:

100.00%

Publisher:

Abstract:

Frequent episode discovery is a popular framework for temporal pattern discovery in event streams. An episode is a partially ordered set of nodes, with each node associated with an event type. Currently, algorithms exist for episode discovery only when the associated partial order is a total order (serial episode) or trivial (parallel episode). In this paper, we propose efficient algorithms for discovering frequent episodes with unrestricted partial orders when the associated event types are unique. These algorithms can be easily specialized to discover only serial or parallel episodes. Also, the algorithms are flexible enough to be specialized for mining in the space of certain interesting subclasses of partial orders. We point out that frequency alone is not a sufficient measure of interestingness in the context of partial order mining. We propose a new interestingness measure for episodes with unrestricted partial orders which, when used along with frequency, results in an efficient scheme of data mining. Simulations are presented to demonstrate the effectiveness of our algorithms.

Relevance:

100.00%

Publisher:

Abstract:

Multiwavelength data indicate that the X-ray-emitting plasma in the cores of galaxy clusters is not cooling catastrophically. To a large extent, cooling is offset by heating due to active galactic nuclei (AGNs) via jets. The cool-core clusters, with cooler/denser plasmas, show multiphase gas and signs of some cooling in their cores. These observations suggest that the cool core is locally thermally unstable while maintaining global thermal equilibrium. Using high-resolution, three-dimensional simulations we study the formation of multiphase gas in cluster cores heated by collimated bipolar AGN jets. Our key conclusion is that spatially extended multiphase filaments form only when the instantaneous ratio of the thermal instability and free-fall timescales (t_TI/t_ff) falls below a critical threshold of ≈ 10. When this happens, dense cold gas decouples from the hot intracluster medium (ICM) phase and generates inhomogeneous and spatially extended Hα filaments. These cold gas clumps and filaments "rain" down onto the central regions of the core, forming a cold rotating torus and in part feeding the supermassive black hole. Consequently, the self-regulated feedback enhances AGN heating and the core returns to a higher entropy level with t_TI/t_ff > 10. Eventually, the core reaches quasi-stable global thermal equilibrium, and cold filaments condense out of the hot ICM whenever t_TI/t_ff ≲ 10. This occurs despite the fact that the energy from AGN jets is supplied to the core in a highly anisotropic fashion. The effective spatial redistribution of heat is enabled in part by the turbulent motions in the wake of freely falling cold filaments. Increased AGN activity can locally reverse the cold gas flow, launching cold filamentary gas away from the cluster center. Our criterion for the condensation of spatially extended cold gas is in agreement with observations and previous idealized simulations.
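
For reference, the two timescales in the criterion are conventionally defined along the following lines; the precise prefactors vary across the literature, so this is a hedged restatement rather than the paper's exact expressions.

```latex
% Free-fall time at radius r under gravitational acceleration g, and a
% thermal-instability timescale of order the gas cooling time (the
% exact prefactor depends on the slope of the cooling function).
\[
  t_{\mathrm{ff}} = \sqrt{\frac{2r}{g}}, \qquad
  t_{\mathrm{TI}} \sim t_{\mathrm{cool}}
    = \frac{\tfrac{3}{2}\, n k_B T}{n_e n_i \Lambda(T)},
\]
\[
  \text{with multiphase condensation expected where }
  \min_r\!\left(\frac{t_{\mathrm{TI}}}{t_{\mathrm{ff}}}\right) \lesssim 10 .
\]
```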

Relevance:

100.00%

Publisher:

Abstract:

Frequent episode discovery is a popular framework for pattern discovery from sequential data. It has found many applications in domains like alarm management in telecommunication networks, fault analysis in manufacturing plants, predicting user behavior in web click streams, and so on. In this paper, we address the discovery of serial episodes. In the episode context, there have been multiple ways to quantify the frequency of an episode. Most of the current algorithms for episode discovery under various frequencies are apriori-based level-wise methods. These methods essentially perform a breadth-first search of the pattern space. However, currently there are no depth-first-based methods of pattern discovery in the frequent episode framework under many of the frequency definitions. In this paper, we try to bridge this gap. We provide new depth-first-based algorithms for serial episode discovery under non-overlapped and total frequencies. Under non-overlapped frequency, we present algorithms that can take care of span and gap constraints on episode occurrences. Under total frequency, we present an algorithm that can handle the span constraint. We provide proofs of correctness for the proposed algorithms. We demonstrate the effectiveness of the proposed algorithms by extensive simulations. We also give detailed run-time comparisons with the existing apriori-based methods and illustrate scenarios under which the proposed pattern-growth algorithms perform better than their apriori counterparts. (C) 2013 Elsevier B.V. All rights reserved.
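
The depth-first (pattern-growth) idea can be sketched without any of the paper's span/gap constraint handling: grow a serial episode one event type at a time and recurse only while it remains frequent. The counting routine repeats the earlier non-overlapped sketch; min_freq and the toy sequence are assumptions.

```python
# A minimal sketch of depth-first (pattern-growth) mining of serial
# episodes under the non-overlapped frequency: extend an episode one
# event type at a time and recurse only while it stays frequent (the
# anti-monotonicity that also powers apriori-style methods).
def count_non_overlapped(sequence, episode):
    """Greedy one-pass count of non-overlapped occurrences (as before)."""
    count, i = 0, 0
    for event in sequence:
        if event == episode[i]:
            i += 1
            if i == len(episode):    # full occurrence found
                count, i = count + 1, 0
    return count

def mine_depth_first(sequence, min_freq):
    alphabet = sorted(set(sequence))
    frequent = []

    def grow(prefix):
        for e in alphabet:
            episode = prefix + (e,)
            if count_non_overlapped(sequence, episode) >= min_freq:
                frequent.append(episode)
                grow(episode)        # depth-first extension of this prefix
    grow(())
    return frequent

seq = list("ABCABCABCAB")
print(mine_depth_first(seq, min_freq=3))
```

Unlike a level-wise apriori pass, each branch here is explored to full depth before the next single-event extension is tried, which is what avoids materializing whole candidate levels.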

Relevance:

100.00%

Publisher:

Abstract:

We develop an optical system for generating multiple light sheets. This is enabled by employing a special class of spatial filters in a cylindrical lens geometry. The proposed binary filter, placed at the back aperture of the cylindrical lens, results in the generation of a periodic transverse pattern extending along the z-axis (i.e., multiple light sheets). Experimental results confirm the generation of multiple light sheets of thickness 6.6 μm with an intersheet spacing of 13.4 μm. The proposed imaging technique may facilitate three-dimensional imaging in nano-optics, fluorescence microscopy, and nanobiology. (C) 2014 Optical Society of America
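
The underlying principle admits a one-dimensional toy simulation: the focal-plane field of a lens is (up to scaling) the Fourier transform of its pupil, so a binary grating in the back aperture produces several diffraction orders, i.e., several parallel sheets. All parameters below (aperture width, grating frequency, peak threshold) are assumptions, not the paper's filter design.

```python
# Toy 1-D Fourier-optics sketch: the focal-plane field of a lens is the
# Fourier transform of its pupil, so a binary grating at the back
# aperture yields several well-separated diffraction orders ("sheets").
import numpy as np

n = 4096
x = np.linspace(-1, 1, n, endpoint=False)    # pupil coordinate (a.u.)
aperture = np.abs(x) <= 0.5                  # lens back aperture
grating = np.cos(2 * np.pi * 20 * x) > 0     # binary spatial filter
pupil = (aperture & grating).astype(float)

# Focal-plane field ~ Fourier transform of the pupil function.
field = np.fft.fftshift(np.fft.fft(np.fft.ifftshift(pupil)))
intensity = np.abs(field) ** 2

# Locate the diffraction orders (the individual sheet positions).
inner = intensity[1:-1]
is_peak = (inner > intensity[:-2]) & (inner > intensity[2:])
strong = is_peak & (inner > 0.1 * intensity.max())
print(np.flatnonzero(strong) + 1 - n // 2)   # e.g. orders near 0, +/-40
```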

Relevance:

40.00%

Publisher:

Abstract:

Due to the boom in the telecommunications market, there is hectic competition among cellular handset manufacturers. As the cellular handset manufacturing industry operates in an oligopoly framework, price rigidity often leads to non-price wars. The handset manufacturing firms indulge in product innovation and also advertise their products in order to achieve their objective of maximizing the discounted flow of profit. It is of interest to see what would be the optimal advertisement-innovation mix that would maximize the discounted flow of profit for the firms. We used differential game theory to solve this problem and adopted the open-loop solution methodology. We experimented with various scenarios over a 30-period horizon and derived interesting managerial insights.

Relevance:

40.00%

Publisher:

Abstract:

Digital human modeling (DHM) involves modeling of the structure, form, and functional capabilities of human users for ergonomics simulation. This paper presents the application of geometric procedures for investigating the characteristics of human visual capabilities, which are particularly important in the context mentioned above. Using the cone of unrestricted directions through the pupil on a tessellated head model as the geometric interpretation of the clinical field-of-view (FoV), the results obtained are experimentally validated. Estimating the pupil movement for a given gaze direction using Listing's Law, FoVs are re-computed. Significant variation of the FoV is observed with the variation in gaze direction. A novel cube-grid representation, which integrates the unit-cube representation of directions and the enhanced slice representation, has been introduced for fast and exact point classification in point-visibility analysis for a given FoV. Computation of the containment frequency of every grid cell for a given set of FoVs enables determination of percentile-based FoV contours for estimating the visual performance of a given population. This is a new concept which makes visibility analysis more meaningful from an ergonomics point of view. The algorithms are fast enough to support interactive analysis of reasonably complex scenes on a typical desktop computer. (C) 2011 Elsevier Ltd. All rights reserved.
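
A heavily simplified stand-in for the point-classification step is sketched below: the paper classifies points against a cone of unrestricted directions on a tessellated head model via a cube-grid, whereas here the FoV is idealized as a circular cone of an assumed half-angle about the gaze direction.

```python
# Simplified FoV point classification (assumption: the FoV is idealized
# as a circular cone about the gaze direction, not the paper's cone of
# unrestricted directions on a tessellated head model).
import numpy as np

def visible(points, eye, gaze, half_angle_deg=60.0):
    """Boolean mask: which points fall inside the view cone."""
    gaze = gaze / np.linalg.norm(gaze)
    d = points - eye                          # eye-to-point vectors
    d /= np.linalg.norm(d, axis=1, keepdims=True)
    cos_angle = d @ gaze                      # cosine of angle from gaze
    return cos_angle >= np.cos(np.radians(half_angle_deg))

pts = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [-1.0, 0.0, 0.0]])
mask = visible(pts, eye=np.zeros(3), gaze=np.array([1.0, 0.0, 0.0]))
print(mask)  # [ True False False]
```

Repeating such a classification per grid cell over a population of FoVs and counting containment frequencies is what yields the percentile-based FoV contours described above.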