29 results for e-government strategy analysis

at Indian Institute of Science - Bangalore - India


Relevance:

30.00%

Publisher:

Abstract:

In this paper, we present an improved load distribution strategy, for arbitrarily divisible processing loads, to minimize the processing time in a distributed linear network of communicating processors by efficiently utilizing their front-ends. Closed-form solutions are derived, with the processing load originating at the boundary and at the interior of the network, under some important conditions on the arrangement of processors and links in the network. Asymptotic analysis is carried out to explore the ultimate performance limits of such networks. Two important theorems are stated regarding the optimal load sequence and the optimal load origination point. A comparative study of this new strategy with an earlier strategy is also presented.
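A minimal numeric sketch of the equal-finish-time principle that underlies divisible load distribution is given below; the per-unit compute times and the compute-only model are illustrative assumptions, while the paper's closed-form solutions additionally account for link communication delays and the overlap enabled by the processors' front-ends.

```python
# Toy illustration of the divisible-load principle: split a total load W so
# that every processor finishes computing at the same instant.
# w[i] = time to process one unit of load on processor i (assumed values).

def equal_finish_split(W, w):
    """Return load shares alpha[i] such that w[i] * alpha[i] is equal for all i."""
    inverse_speeds = [1.0 / wi for wi in w]
    scale = W / sum(inverse_speeds)
    return [scale * inv for inv in inverse_speeds]

speeds = [1.0, 2.0, 4.0]                      # slower processors have larger w[i]
shares = equal_finish_split(W=1.0, w=speeds)
print([round(a, 3) for a in shares])          # faster processors receive more load
print([round(wi * a, 3) for wi, a in zip(speeds, shares)])  # identical finish times
```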

Relevance:

30.00%

Publisher:

Abstract:

Background: Tuberculosis still remains one of the largest killer infectious diseases, warranting the identification of newer targets and drugs. Identification and validation of appropriate targets for designing drugs are critical steps in drug discovery, which are at present major bottlenecks. A majority of drugs in current clinical use for many diseases have been designed without the knowledge of the targets, perhaps because standard methodologies to identify such targets in a high-throughput fashion do not really exist. With different kinds of 'omics' data that are now available, computational approaches can be powerful means of obtaining short-lists of possible targets for further experimental validation. Results: We report a comprehensive in silico target identification pipeline, targetTB, for Mycobacterium tuberculosis. The pipeline incorporates a network analysis of the protein-protein interactome, a flux balance analysis of the reactome, experimentally derived phenotype essentiality data, sequence analyses and a structural assessment of targetability, using novel algorithms recently developed by us. Using flux balance analysis and network analysis, proteins critical for survival of M. tuberculosis are first identified, followed by comparative genomics with the host, finally incorporating a novel structural analysis of the binding sites to assess the feasibility of a protein as a target. Further analyses include correlation with expression data and non-similarity to gut flora proteins as well as 'anti-targets' in the host, leading to the identification of 451 high-confidence targets. Through phylogenetic profiling against 228 pathogen genomes, shortlisted targets have been further explored to identify broad-spectrum antibiotic targets, while also identifying those specific to tuberculosis. Targets that address mycobacterial persistence and drug resistance mechanisms are also analysed. Conclusion: The pipeline developed provides a rational schema for drug target identification that is likely to have a high rate of success, which is expected to save enormous amounts of money, resources and time in the drug discovery process. A thorough comparison with previously suggested targets in the literature demonstrates the usefulness of the integrated approach used in our study, highlighting the importance of systems-level analyses in particular. The method has the potential to be used as a general strategy for target identification and validation and hence significantly impact most drug discovery programmes.
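As an illustration only, the successive-filtering idea behind such a pipeline can be sketched with plain set operations; every protein name and set membership below is a hypothetical placeholder rather than data from the study.

```python
# Schematic sketch of sequential target filtering: each criterion prunes the
# candidate set, and host-similar or anti-target-like proteins are removed.
# All identifiers and memberships are invented for illustration.

proteome            = {"protA", "protB", "protC", "protD", "protE"}
network_critical    = {"protB", "protC", "protD", "protE"}   # interactome analysis
fba_essential       = {"protB", "protC", "protE"}             # flux balance analysis
phenotype_essential = {"protB", "protC", "protD", "protE"}    # experimental essentiality
host_similar        = {"protC"}                                # comparative genomics vs host
gut_flora_similar   = set()                                    # similarity to gut flora proteins
anti_target_like    = set()                                    # similarity to host 'anti-targets'

candidates = proteome & network_critical & fba_essential & phenotype_essential
shortlist  = candidates - host_similar - gut_flora_similar - anti_target_like
print(sorted(shortlist))   # would then go to structural targetability assessment
```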

Relevance:

30.00%

Publisher:

Abstract:

Data-flow analysis is an integral part of any aggressive optimizing compiler. We propose a framework for improving the precision of data-flow analysis in the presence of complex control-flow. We initially perform data-flow analysis to determine those control-flow merges which cause the loss in data-flow analysis precision. The control-flow graph of the program is then restructured such that performing data-flow analysis on the resulting restructured graph gives more precise results. The proposed framework is both simple, involving the familiar notion of product automata, and general, since it is applicable to any forward data-flow analysis. Apart from proving that our restructuring process is correct, we also show that restructuring is effective in that it necessarily leads to more optimization opportunities. Furthermore, the framework handles the trade-off between the increase in data-flow precision and the code size increase inherent in the restructuring. We show that determining an optimal restructuring is NP-hard, and propose and evaluate a greedy strategy. The framework has been implemented in the Scale research compiler, and instantiated for the specific problem of Constant Propagation. On the SPECINT 2000 benchmark suite we observe an average speedup of 4% in the running times over the Wegman-Zadeck conditional constant propagation algorithm and 2% over a purely path profile guided approach.
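A tiny sketch of the precision loss that motivates the restructuring, using the usual constant-propagation lattice; the lattice encoding and the example merge are illustrative, not the paper's formulation.

```python
# Constant-propagation meet at a control-flow merge: conflicting constants
# collapse to "not a constant", which is exactly the precision loss that
# restructuring (duplicating code along the incoming paths) avoids.

TOP = "NAC"       # not a constant
BOTTOM = "UNDEF"  # no information yet

def meet(a, b):
    """Meet of two dataflow facts for one variable at a merge point."""
    if a == BOTTOM:
        return b
    if b == BOTTOM:
        return a
    return a if a == b else TOP

# Two predecessors reach the merge with x = 1 and x = 2 respectively:
print(meet(1, 2))   # -> 'NAC'; after restructuring, each duplicated successor
                    #    keeps its own constant, at the cost of larger code size.
```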

Relevance:

30.00%

Publisher:

Abstract:

India's energy challenges are multi-pronged. They are manifested through growing demand for modern energy carriers, a fossil fuel dominated energy system facing a severe resource crunch, the need for creating access to quality energy for the large section of deprived population, vulnerable energy security, local and global pollution regimes and the need for sustaining economic development. Renewable energy is considered as one of the most promising alternatives. Recognizing this potential, India has been implementing one of the largest renewable energy programmes in the world. Among the renewable energy technologies, bioenergy has a large diverse portfolio including efficient biomass stoves, biogas, biomass combustion and gasification, process heat and liquid fuels. India has also formulated and implemented a number of innovative policies and programmes to promote bioenergy technologies. However, according to some preliminary studies, the success rate is marginal compared to the potential available. This limited success is a clear indicator of the need for a serious reassessment of the bioenergy programme. Further, a realization of the need for adopting a sustainable energy path to address the above challenges will be the guiding force in this reassessment. In this paper an attempt is made to consider the potential of bioenergy to meet rural energy needs: (1) biomass combustion and gasification for electricity; (2) biomethanation for cooking energy (gas) and electricity; and (3) efficient wood-burning devices for cooking. The paper focuses on analysing the effectiveness of bioenergy in creating this rural energy access and its sustainability in the long run through assessing: the demand for bioenergy and the potential that could be created; technologies, status of commercialization and technology transfer and dissemination in India; economic and environmental performance and impacts; and bioenergy policies, regulatory measures and barrier analysis. The whole assessment aims at presenting bioenergy as an integral part of a sustainable energy strategy for India. The results show that bioenergy technology (BET) alternatives compare favourably with the conventional ones. The cost comparisons show that the unit costs of BET alternatives are in the range of 15-187% of the conventional alternatives. The climate change benefits in terms of carbon emission reductions are to the tune of 110 T C per year provided the available potential of BETs is utilized.

Relevance:

30.00%

Publisher:

Abstract:

Provision of modern energy services for cooking (with gaseous fuels) and lighting (with electricity) is an essential component of any policy aiming to address health, education or welfare issues; yet it gets little attention from policy-makers. Secure, adequate, low-cost energy of quality and convenience is core to the delivery of these services. The present study analyses the energy consumption pattern of the Indian domestic sector and examines the urban-rural divide and the income-energy linkage. A comprehensive analysis is done to estimate the cost of providing modern energy services to everyone by 2030. A public-private partnership-driven business model, with entrepreneurship at the core, is developed with institutional, financing and pricing mechanisms for diffusion of energy services. This approach, termed EMPOWERS (entrepreneurship model for provision of wholesome energy-related basic services), if adopted, can facilitate large-scale dissemination of energy-efficient and renewable technologies like small-scale biogas/biofuel plants, and distributed power generation technologies to provide clean, safe, reliable and sustainable energy to rural households and the urban poor. It is expected to integrate the processes of market transformation and entrepreneurship development involving government, NGOs, financial institutions and community groups as stakeholders. (C) 2009 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Publisher:

Abstract:

We propose a self-regularized pseudo-time marching scheme to solve the ill-posed, nonlinear inverse problem associated with diffuse propagation of coherent light in a tissue-like object. In particular, in the context of diffuse correlation tomography (DCT), we consider the recovery of mechanical property distributions from partial and noisy boundary measurements of light intensity autocorrelation. We prove the existence of a minimizer for the Newton algorithm after establishing the existence of weak solutions for the forward equation of light amplitude autocorrelation and its Frechet derivative and adjoint. The asymptotic stability of the solution of the ordinary differential equation obtained through the introduction of the pseudo-time is also analyzed. We show that the asymptotic solution obtained through the pseudo-time marching converges to the optimal solution provided the Hessian of the forward equation is positive definite in the neighborhood of the optimal solution. The superior noise tolerance and regularization-insensitive nature of the pseudo-dynamic strategy are demonstrated through numerical simulations in the context of both DCT and diffuse optical tomography. (C) 2010 Optical Society of America.
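The pseudo-time idea itself can be illustrated on a toy quadratic misfit: evolve the unknown in an artificial time and take the asymptotic state. The operator, data and step size below are arbitrary stand-ins, not the DCT forward model.

```python
import numpy as np

# Pseudo-time marching on a toy least-squares problem: dx/dtau = -grad Phi(x),
# integrated with explicit Euler until the flow settles at the minimiser.

A = np.array([[2.0, 0.5],
              [0.5, 1.0]])        # stand-in "forward operator"
b = np.array([1.0, 2.0])          # stand-in "measurement"

def grad(x):
    return A.T @ (A @ x - b)      # gradient of Phi(x) = 0.5 * ||A x - b||^2

x = np.zeros(2)
dtau = 0.1                        # pseudo-time step
for _ in range(500):              # march in pseudo-time
    x = x - dtau * grad(x)

print(x)                          # asymptotic state of the pseudo-time flow
print(np.linalg.solve(A, b))      # matches the direct minimiser
```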

Relevance:

30.00%

Publisher:

Abstract:

In this paper we introduce a nonlinear detector based on the phenomenon of suprathreshold stochastic resonance (SSR). We first present a model (an array of 1-bit quantizers) that demonstrates the SSR phenomenon. We then use this as a pre-processor to the conventional matched filter. We employ the Neyman-Pearson (NP) detection strategy and compare the performances of the matched filter, the SSR-based detector and the optimal detector. Although the proposed detector is non-optimal, for non-Gaussian noises with heavy tails (leptokurtic) it shows better performance than the matched filter. In situations where the noise is known to be leptokurtic without exact knowledge of its distribution, the proposed detector turns out to be a better choice than the matched filter.
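A minimal sketch of the SSR front end described above (an array of independently noisy 1-bit quantizers whose outputs are summed) is shown here; the array size, threshold and noise level are illustrative choices, not the paper's parameters.

```python
import numpy as np

# SSR front end: each input sample is passed through N one-bit quantizers,
# each perturbed by its own independent noise, and the binary outputs are
# summed. The summed output would then feed a matched filter followed by a
# Neyman-Pearson threshold test.

rng = np.random.default_rng(0)

def ssr_frontend(x, n_quantizers=16, noise_std=0.5, threshold=0.0):
    """Return the per-sample count of quantizers that fired (0..N)."""
    noise = rng.normal(0.0, noise_std, size=(n_quantizers, x.size))
    bits = (x[None, :] + noise > threshold)
    return bits.sum(axis=0).astype(float)

signal = 0.3 * np.sin(np.linspace(0.0, 2.0 * np.pi, 100))
print(ssr_frontend(signal)[:5])   # noisy, quantized representation of the input
```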

Relevance:

30.00%

Publisher:

Abstract:

We provide a comparative performance analysis of network architectures for beacon-enabled Zigbee sensor clusters using the CSMA/CA MAC defined in the IEEE 802.15.4 standard, organised as (i) a star topology, and (ii) a two-hop topology. We provide analytical models for obtaining performance measures such as mean network delay and mean node lifetime. We find that the star topology is substantially superior to the two-hop topology in both delay and lifetime performance.

Relevance:

30.00%

Publisher:

Abstract:

A posteriori error estimation and adaptive refinement techniques are the state of the art in fracture analysis of 2-D/3-D crack problems. The objective of the present paper is to propose a new a posteriori error estimator based on the strain energy release rate (SERR) or stress intensity factor (SIF) at the crack tip region and to use this along with the stress-based error estimator available in the literature for the region away from the crack tip. The proposed a posteriori error estimator is called the K-S error estimator. Further, an adaptive mesh refinement (h-) strategy which can be used with the K-S error estimator has been proposed for fracture analysis of 2-D crack problems. The performance of the proposed a posteriori error estimator and the h-adaptive refinement strategy has been demonstrated by employing 4-noded, 8-noded and 9-noded plane stress finite elements. The proposed error estimator together with the h-adaptive refinement strategy will facilitate automation of the fracture analysis process to provide reliable solutions.
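The refine-until-converged loop behind any h-adaptive strategy can be illustrated on a 1-D toy problem whose solution has an r^(1/2), crack-tip-like singularity; the indicator and marking rule below are generic illustrations, not the K-S estimator itself.

```python
import math

# Toy h-adaptive loop: estimate a local error indicator per element, mark the
# worst elements, bisect them, and repeat. u(x) = sqrt(x) mimics the crack-tip
# singularity, so refinement should cluster near x = 0.

def u(x):
    return math.sqrt(x)

def indicator(a, b):
    """Midpoint error of linear interpolation of u on element [a, b]."""
    mid = 0.5 * (a + b)
    return abs(u(mid) - 0.5 * (u(a) + u(b)))

def refine(mesh, tol=1e-3, max_iter=25):
    for _ in range(max_iter):
        errors = [indicator(a, b) for a, b in mesh]
        worst = max(errors)
        if worst < tol:
            break
        new_mesh = []
        for (a, b), e in zip(mesh, errors):
            if e >= 0.5 * worst:            # mark and bisect the worst elements
                m = 0.5 * (a + b)
                new_mesh += [(a, m), (m, b)]
            else:
                new_mesh.append((a, b))
        mesh = new_mesh
    return mesh

mesh = refine([(0.0, 1.0)])
print(len(mesh), min(b - a for a, b in mesh))   # smallest elements sit near x = 0
```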

Relevance:

30.00%

Publisher:

Abstract:

The objective of the present paper is to select the best compromise irrigation planning strategy for the case study of the Jayakwadi irrigation project, Maharashtra, India. A four-phase methodology is employed. In phase 1, separate linear programming (LP) models are formulated for the three objectives, namely, net economic benefits, agricultural production and labour employment. In phase 2, nondominated (compromise) irrigation planning strategies are generated using the constraint method of multiobjective optimisation. In phase 3, a Kohonen neural network (KNN) based classification algorithm is employed to sort nondominated irrigation planning strategies into smaller groups. In phase 4, a multicriterion analysis (MCA) technique, namely, Compromise Programming, is applied to rank the strategies obtained from phase 3. It is concluded that the above integrated methodology is effective for modeling multiobjective irrigation planning problems and that the present approach can be extended to situations where the number of irrigation planning strategies is even larger. (c) 2004 Elsevier Ltd. All rights reserved.
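The phase-4 ranking step can be sketched as follows: each non-dominated strategy is scored by its weighted, normalised L_p distance from the ideal point. The criteria values, weights and strategy names are illustrative placeholders, not results from the case study.

```python
# Compromise Programming: rank alternatives by weighted L_p distance from the
# ideal point (all three criteria are to be maximised in this toy example).

def lp_distance(values, ideal, anti_ideal, weights, p=2):
    """Weighted, normalised L_p distance of one strategy from the ideal point."""
    total = 0.0
    for v, best, worst, w in zip(values, ideal, anti_ideal, weights):
        total += (w * (best - v) / (best - worst)) ** p
    return total ** (1.0 / p)

# criteria: (net economic benefits, agricultural production, labour employment)
strategies = {"S1": (95, 80, 60), "S2": (85, 90, 70), "S3": (70, 75, 95)}
ideal      = (95, 90, 95)            # best value of each criterion over all strategies
anti_ideal = (70, 75, 60)            # worst value of each criterion
weights    = (1 / 3, 1 / 3, 1 / 3)

ranking = sorted(strategies,
                 key=lambda s: lp_distance(strategies[s], ideal, anti_ideal, weights))
print(ranking)                       # best-compromise strategy listed first
```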

Relevance:

30.00%

Publisher:

Abstract:

Hybrid elements, which are based on a two-field variational formulation with the displacements and stresses interpolated separately, are known to deliver very high accuracy and to alleviate, to a large extent, the locking problems that plague standard displacement-based formulations. The choice of the stress interpolation functions is of course critical in ensuring the high accuracy and robustness of the method. Generally, an attempt is made to keep the stress interpolation to the minimum number of terms that will ensure that the stiffness matrix has no spurious zero-energy modes, since it is known that the stiffness increases with the number of terms. Although such a strategy of keeping the number of interpolation terms to a minimum works very well in static problems, it results either in instabilities or in failure to converge in transient problems. This is because choosing the stress interpolation functions merely on the basis of removing spurious energy modes can violate some basic principles that interpolation functions should obey. In this work, we address the issue of choosing the interpolation functions based on such basic principles of interpolation theory and mechanics. Although this procedure results in the use of more terms than the minimum (and hence in slightly increased stiffness) in many elements, we show that the performance continues to be far superior to displacement-based formulations and, more importantly, that it also results in considerably increased robustness.

Relevance:

30.00%

Publisher:

Abstract:

A thermal model for a conventional biogas plant has been developed in order to understand the heat transfer from the slurry and the gas holder to the surrounding earth and air respectively. The computations have been performed for two conditions: (i) when the slurry is at an ambient temperature of 20°C, and (ii) when it is at 35°C, the optimum temperature for anaerobic fermentation. Under both these conditions, the gas holder is the major “culprit” with regard to heat losses from the biogas plant. The calculations provide an estimate of the heat which has to be supplied by external means to compensate for the net heat losses which occur if the slurry is to be maintained at 35°C. Even if this external supply of heat is realised through (the calorific value of) biogas, there is a net increase in the biogas output, and therefore a net benefit, in operating the plant at 35°C. At this elevated temperature, the cooling effect of adding the influent at ambient temperature is not insignificant. In conclusion, the results of the thermal analysis are used to define a strategy for operating biogas plants at optimum temperatures, or at temperatures higher than the ambient.
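A back-of-envelope version of the heat balance discussed above is sketched below: steady losses through the gas holder and the ground, and the make-up biogas implied by the deficit. All heat transfer coefficients, areas and the calorific value are assumed round numbers, not the paper's figures.

```python
# Steady-state heat balance sketch: Q = U * A * (T_slurry - T_ambient) for each
# loss path, and the make-up biogas implied by the total daily loss.

def heat_loss(U, area, t_inside, t_outside):
    """Steady conductive/convective loss in watts (U in W/m^2.K, area in m^2)."""
    return U * area * (t_inside - t_outside)

q_gas_holder = heat_loss(U=5.0, area=6.0,  t_inside=35.0, t_outside=20.0)   # W (assumed values)
q_ground     = heat_loss(U=1.0, area=10.0, t_inside=35.0, t_outside=20.0)   # W (assumed values)

loss_mj_per_day   = (q_gas_holder + q_ground) * 86400 / 1e6
biogas_m3_per_day = loss_mj_per_day / 22.0    # ~22 MJ per m^3 of biogas (approximate)
print(round(loss_mj_per_day, 1), "MJ/day ->", round(biogas_m3_per_day, 2), "m^3 biogas/day")
```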

Relevance:

30.00%

Publisher:

Abstract:

One of the most important factors that affect the pointing of precision payloads and devices on space platforms is the vibration generated by static and dynamic unbalanced forces of rotary equipment placed in the neighborhood of the payload. Generally, such disturbances are of low amplitude, occur at frequencies below 1 kHz, and are termed ‘micro-vibrations’. Due to the low damping in the space structure, these vibrations have long decay times and degrade the performance of the payload. This paper addresses the design, modeling and analysis of a low frequency space frame platform for passive and active attenuation of micro-vibrations. This flexible platform has been designed to act as a mount for devices like reaction wheels, and consists of four folded continuous beams arranged in three dimensions. Frequency and response analyses have been carried out by varying the number of folds and the thickness of the vertical beam. Results show that lower frequencies can be achieved by increasing the number of folds and by decreasing the thickness of the blade. In addition, active vibration control is studied by incorporating piezoelectric actuators and sensors in the dynamic model. It is shown through simulation that a control strategy using optimal control is effective for vibration suppression under a wide variety of loading conditions.
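A minimal sketch of the optimal-control idea mentioned above, applied to a single lightly damped vibration mode with an LQR state-feedback design; the modal frequency, damping and weighting matrices are illustrative values, not the platform model from the paper.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# LQR for one lightly damped mode: x = [modal displacement, modal velocity],
# actuator force enters the velocity equation. The closed-loop eigenvalues
# show the added damping obtained from the optimal state feedback.

wn, zeta = 2.0 * np.pi * 20.0, 0.005          # 20 Hz mode, very light damping (assumed)
A = np.array([[0.0, 1.0],
              [-wn**2, -2.0 * zeta * wn]])
B = np.array([[0.0],
              [1.0]])
Q = np.diag([wn**2, 1.0])                     # penalise displacement and velocity
R = np.array([[1e-3]])                        # penalise actuator effort

P = solve_continuous_are(A, B, Q, R)          # solve the algebraic Riccati equation
K = np.linalg.inv(R) @ B.T @ P                # state-feedback gain, u = -K x

print(np.linalg.eigvals(A).real)              # open loop: real parts near zero
print(np.linalg.eigvals(A - B @ K).real)      # closed loop: substantially more damping
```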

Relevance:

30.00%

Publisher:

Abstract:

The modal approach is widely used for the analysis of the dynamics of flexible structures. However, space analysts still lack an intimate modal analysis of current spacecraft, which are rich in flexibility and possess both structural and discrete damping. Mathematical modeling of such spacecraft incapacitates the existing real transformation procedure, for it cannot include discrete damping, demands the inversion of a modal matrix that is inaccessible due to its overwhelming size, and does not permit truncation. On the other hand, complex transformation techniques entail more computational time and cannot handle structural damping. This paper presents a real transformation strategy which averts inversion of the associated real transformation matrix, allows truncation and accommodates both forms of damping simultaneously. This is accomplished by establishing a key relation between the real transformation matrix and its adjoint. The relation permits truncation of the matrices and leads to uncoupled pairs of coupled first-order equations which contain a number of adjoint eigenvectors. Finally these pairs are solved to obtain a literal modal response of forced, gyroscopic, damped flexible systems under arbitrary initial conditions.

Relevance:

30.00%

Publisher:

Abstract:

Triplex forming oligonucleotides (TFOs) have the potential to modulate gene expression. While most experiments are directed towards triplex-mediated inhibition of gene expression, the strategy could potentially be used for gene-specific activation. In an attempt to design a strategy for gene-specific activation in vivo, applicable to a large number of genes, we have designed a TFO-based activator-target system which may be utilized in Saccharomyces cerevisiae or any other system where the Gal4 protein is ectopically expressed. The total genome sequence of Saccharomyces cerevisiae and expression profiles were used to select target genes with upstream poly (pu/py) sequences. We have utilized the paradigm of the Gal4 protein and its binding site. We describe here the selection of target genes and the design of the hairpin-TFO, including targeting sequences containing a polypurine stretch found in the upstream promoter regions of weakly expressed genes. We demonstrate the formation of the hairpin-TFO, its binding to the Gal4 protein, its ability to form a triplex with the target duplex in vitro, and the effect of polyethylenimine on complex formation, and we discuss the implications for in vivo transcription activation.