18 results for Case Based Computing

at Indian Institute of Science - Bangalore - India


Relevance:

90.00%

Publisher:

Abstract:

Hydrogen bonds in biological macromolecules play significant structural and functional roles. They are the key contributors to most of the interactions without which no living system exists. In view of this, a web-based computing server, the Hydrogen Bonds Computing Server (HBCS), has been developed to compute hydrogen-bond interactions and their standard deviations for any given macromolecular structure. The computing server is connected to a locally maintained Protein Data Bank (PDB) archive. Thus, the user can calculate the above parameters for any deposited structure, and options have also been provided for the user to upload a structure in PDB format from the client machine. In addition, the server has been interfaced with the molecular viewers Jmol and JSmol to visualize the hydrogen-bond interactions. The proposed server is freely available and accessible via the World Wide Web at http://bioserver1.physics.iisc.ernet.in/hbcs/.
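
HBCS applies full geometric criteria and also reports standard deviations; as a rough illustration of the core computation, the sketch below screens N/O atom pairs in a PDB file against a simple donor-acceptor distance window. The 2.5-3.5 Å cutoffs and fixed-column PDB parsing are standard conventions, but everything here is an assumption about the approach, not the server's actual code.

```python
import math

POLAR = {"N", "O"}  # element symbols treated as potential donors/acceptors

def parse_atoms(path):
    """Yield (atom_name, element, residue_id, (x, y, z)) from ATOM/HETATM records."""
    with open(path) as fh:
        for line in fh:
            if line.startswith(("ATOM", "HETATM")):
                element = line[76:78].strip() or line[12:14].strip()
                residue_id = line[21] + line[17:20].strip() + line[22:26].strip()
                coords = (float(line[30:38]), float(line[38:46]), float(line[46:54]))
                yield line[12:16].strip(), element, residue_id, coords

def hydrogen_bonds(atoms, dmin=2.5, dmax=3.5):
    """Return candidate donor-acceptor pairs whose separation lies in [dmin, dmax]."""
    polar = [a for a in atoms if a[1] in POLAR]
    pairs = []
    for i, a in enumerate(polar):
        for b in polar[i + 1:]:
            if a[2] == b[2]:            # ignore contacts within the same residue
                continue
            d = math.dist(a[3], b[3])
            if dmin <= d <= dmax:
                pairs.append((a[0], a[2], b[0], b[2], round(d, 2)))
    return pairs
```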

Relevance:

80.00%

Publisher:

Abstract:

Twin NLO chromophores having two azobenzene units linked by a flexible polymethylene spacer of varying length are shown to exhibit odd-even oscillations in their second harmonic generation (SHG) efficiencies when measured in powder form. These twin systems were also designed to exhibit liquid crystallinity, and indeed most of them exhibit a nematic mesophase. The anticipated odd-even oscillations in both their isotropization transition temperatures (Ti) and isotropization entropies (Delta Si) were also observed. The odd-even oscillation of the SHG efficiencies has been ascribed to a more effective cancellation of mesogenic dipoles in the even twins than in their odd counterparts, owing to a preferred centrosymmetric packing in the former case. Based on the behaviour of these twin chromophoric molecules, it may be anticipated that such odd-even oscillations will also be observed in the analogous main-chain NLO polymers.

Relevance:

50.00%

Publisher:

Abstract:

A computationally efficient approach that computes the optimal regularization parameter for the Tikhonov minimization scheme is developed for photoacoustic imaging. This approach is based on the least squares QR (LSQR) decomposition, a well-known dimensionality-reduction technique for large systems of equations. It is shown that the proposed framework is effective in terms of quantitative and qualitative reconstruction of the initial pressure distribution, enabled by finding an optimal regularization parameter. The computational efficiency and performance of the proposed method are shown using a test case of a numerical blood-vessel phantom, where the initial pressure is exactly known for quantitative comparison. (C) 2013 Society of Photo-Optical Instrumentation Engineers (SPIE).
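
The paper's approach builds on LSQR for large systems; as a small-scale stand-in, the sketch below selects the Tikhonov parameter by minimizing a generalized cross-validation (GCV) score over a grid, using a direct SVD. The GCV criterion, the grid, and the toy problem are assumptions for illustration, not the paper's exact selection rule.

```python
import numpy as np

def tikhonov_gcv(A, b, lambdas):
    """Return (x_opt, lambda_opt) minimizing the GCV score over `lambdas`."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    beta = U.T @ b
    best = (None, None, np.inf)
    for lam in lambdas:
        f = s**2 / (s**2 + lam**2)           # Tikhonov filter factors
        x = Vt.T @ (f * beta / s)            # regularized solution for this lambda
        resid = np.linalg.norm(A @ x - b)
        gcv = resid**2 / (len(b) - f.sum())**2
        if gcv < best[2]:
            best = (x, lam, gcv)
    return best[0], best[1]

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30))                          # toy forward model
b = A @ rng.standard_normal(30) + 0.05 * rng.standard_normal(50)
x_opt, lam_opt = tikhonov_gcv(A, b, np.logspace(-4, 1, 60))
```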

Relevance:

40.00%

Publisher:

Abstract:

Pricing is an effective tool to control congestion and achieve quality of service (QoS) provisioning for multiple differentiated levels of service. In this paper, we consider the problem of pricing for congestion control in the case of a network of nodes with a single service class and multiple queues, and present a multi-layered pricing scheme. We propose an algorithm for finding the optimal state-dependent price levels for individual queues at each node. The pricing policy used depends on a weighted average queue length at each node; this helps reduce frequent price variations and is in the spirit of the random early detection (RED) mechanism used in TCP/IP networks. Our numerical results show a considerable improvement in both throughput and delay performance using our scheme over a recently proposed related scheme. In particular, our approach exhibits a throughput improvement in the range of 34 to 69 percent over that scheme in all cases studied (over all routes).
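
A minimal sketch of the queue-averaging idea, assuming illustrative thresholds, prices, and averaging weight: each node folds its instantaneous queue length into a RED-style exponentially weighted average and maps that average to one of several discrete price levels. The paper's algorithm additionally optimizes the state-dependent price levels themselves, which this sketch does not attempt.

```python
class NodePricer:
    def __init__(self, thresholds=(5.0, 15.0, 30.0),
                 prices=(1.0, 2.0, 4.0, 8.0), w=0.02):
        self.thresholds = thresholds   # boundaries between price layers (assumed)
        self.prices = prices           # one price per layer (len = thresholds + 1)
        self.w = w                     # averaging weight, as in RED
        self.avg = 0.0                 # weighted average queue length

    def update(self, queue_length):
        """Fold the instantaneous queue length into the moving average."""
        self.avg = (1 - self.w) * self.avg + self.w * queue_length
        return self.avg

    def price(self):
        """Map the averaged queue length to a discrete price level."""
        for threshold, p in zip(self.thresholds, self.prices):
            if self.avg < threshold:
                return p
        return self.prices[-1]
```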

Relevance:

40.00%

Publisher:

Abstract:

STOAT has been used extensively for dynamic simulation of the activated-sludge-based wastewater treatment plant at the Titagarh Sewage Treatment Plant near Kolkata, India, and some alternative schemes were suggested. The different schemes were compared for the removal of Total Suspended Solids (TSS), b-COD, ammonia, nitrates, etc. A combination of the IAWQ#1 module with the Takacs module gave the best results for the existing scenarios of the Titagarh plant. The modified Bardenpho process was found most effective at reducing the mean b-COD level, to as low as 31.4 mg/l, although the mean TSS level remained as high as 100.98 mg/l, compared with mean levels of TSS (92.62 mg/l) and b-COD (92.0 mg/l) in the existing plant. Scheme 2 gave a better outcome for TSS, bringing the mean level down to 0.4 mg/l, but a higher mean b-COD level of 54.89 mg/l. The Final Scheme reduced the mean TSS level to 2.9 mg/l and the mean b-COD level to as low as 38.8 mg/l, and appears to be technically viable with respect to the overall effluent quality of the plant. (C) 2009 Elsevier B.V. All rights reserved.

Relevance:

40.00%

Publisher:

Abstract:

The problem of sensor-network-based distributed intrusion detection in the presence of clutter is considered. It is argued that sensing is best regarded as a local phenomenon, in that only sensors in the immediate vicinity of an intruder are triggered. In such a setting, lack of knowledge of the intruder's location gives rise to correlated sensor readings. A signal-space viewpoint is introduced in which the noise-free sensor readings associated with intruder and clutter appear as surfaces $\mathcal{S}_I$ and $\mathcal{S}_C$, and the problem reduces to determining, in a distributed fashion, whether the current noisy sensor reading is best classified as intruder or clutter. Two approaches to distributed detection are pursued. In the first, a decision surface separating $\mathcal{S}_I$ and $\mathcal{S}_C$ is identified using Neyman-Pearson criteria. Thereafter, the individual sensor nodes interactively exchange bits to determine whether the sensor readings lie on one side or the other of the decision surface. Bounds on the number of bits that need to be exchanged are derived from communication complexity (CC) theory. A lower bound derived for the two-party average-case CC of general functions is compared against the performance of a greedy algorithm, and the average-case CC of the relevant greater-than (GT) function is characterized to within two bits. In the second approach, each sensor node broadcasts a single bit arising from an appropriate two-level quantization of its own sensor reading, keeping in mind the fusion rule to be subsequently applied at a local fusion center. The optimality of a threshold test as the quantization rule is proved under simplifying assumptions. Finally, results from a QualNet simulation of the algorithms are presented, including intruder tracking using a naive polynomial-regression algorithm.
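
A minimal sketch of the second approach, with an assumed threshold and a counting fusion rule; the paper derives the optimal threshold test and fusion rule, so the values here are placeholders only.

```python
import numpy as np

def sensor_bits(readings, tau):
    """Two-level quantization: each sensor broadcasts 1 iff its reading exceeds tau."""
    return (np.asarray(readings) > tau).astype(int)

def fusion_decision(bits, k):
    """Local fusion center declares 'intruder' when at least k broadcast bits are 1."""
    return int(bits.sum() >= k)

readings = np.array([0.1, 0.9, 1.2, 0.05, 0.8])   # noisy readings from five sensors
print(fusion_decision(sensor_bits(readings, tau=0.5), k=2))   # -> 1 (intruder)
```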

Relevance:

40.00%

Publisher:

Abstract:

The move towards IT outsourcing is the first step towards an environment where compute infrastructure is treated as a service. In utility computing, this IT service has to honor Service Level Agreements (SLAs) in order to meet the desired Quality of Service (QoS) guarantees. Such an environment requires reliable services in order to maximize the utilization of resources and to decrease the Total Cost of Ownership (TCO). This reliability cannot come at the cost of resource duplication, since duplication increases the TCO of the data center and hence the cost per compute unit. In this paper, we look into aspects of projecting the impact of hardware failures on SLAs and the techniques required to take proactive recovery steps in case of a predicted failure. By maintaining health vectors for all hardware and system resources, we predict the failure probability of resources at runtime, based on observed hardware errors and failure events. This in turn prompts an availability-aware middleware to take proactive action, even before the application is affected, in cases where the system and the application have low recoverability. The proposed framework has been prototyped on a system running HP-UX. Our offline analysis of the prediction system on hardware error logs indicates no more than 10% false positives. To the best of our knowledge, this work is the first of its kind to perform an end-to-end analysis of the impact of a hardware fault on application SLAs in a live system.
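
A hedged sketch of the prediction step, assuming a simple decaying error count as the health vector and an exponential mapping to failure probability; the paper's actual health vectors, probability model, and recoverability logic are not specified here.

```python
import math

class HealthVector:
    def __init__(self, decay=0.9, scale=5.0):
        self.score = 0.0
        self.decay = decay     # how quickly old errors are forgotten (assumed)
        self.scale = scale     # errors-to-probability scaling (assumed)

    def observe(self, error_count):
        """Fold the hardware errors seen in this interval into the health score."""
        self.score = self.decay * self.score + error_count

    def failure_probability(self):
        """Map the accumulated score into (0, 1); a higher score means higher risk."""
        return 1.0 - math.exp(-self.score / self.scale)

def needs_proactive_recovery(resource, threshold=0.7):
    """Flag a resource for proactive action (e.g. migration) before SLAs suffer."""
    return resource.failure_probability() > threshold
```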

Relevance:

40.00%

Publisher:

Abstract:

As research becomes more and more interdisciplinary, literature searches are often carried out on more than one CD-ROM database. This results in retrieving duplicate records, because the same literature is covered (indexed) in more than one database, and the retrieval software does not identify such duplicates. Three different programs have been written to accomplish the task of identifying the duplicate records; they are executed from a shell script to minimize manual intervention. The fields extracted to identify duplicate records are the article title, year, volume number, issue number, and pagination. The shell script, when executed, prompts for an input file that may contain duplicate records; the programs then identify the duplicate records and write them to a new file.
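
A sketch of the duplicate-identification step, assuming the retrieved records have already been exported to CSV with the extracted fields as columns; the original programs parse the CD-ROM databases' own record formats, so the field names and I/O below are illustrative only.

```python
import csv

KEY_FIELDS = ("title", "year", "volume", "issue", "pages")  # assumed column names

def split_duplicates(in_path, unique_path, dup_path):
    """Write first occurrences to unique_path and repeats to dup_path."""
    seen = set()
    with open(in_path, newline="") as src, \
         open(unique_path, "w", newline="") as uniq, \
         open(dup_path, "w", newline="") as dups:
        reader = csv.DictReader(src)
        uw = csv.DictWriter(uniq, fieldnames=reader.fieldnames)
        dw = csv.DictWriter(dups, fieldnames=reader.fieldnames)
        uw.writeheader(); dw.writeheader()
        for record in reader:
            # normalize the key fields so case/whitespace differences still match
            key = tuple(record.get(f, "").strip().lower() for f in KEY_FIELDS)
            (dw if key in seen else uw).writerow(record)
            seen.add(key)
```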

Relevance:

40.00%

Publisher:

Abstract:

This paper presents the development of a neural-network-based power system stabilizer (PSS) designed to enhance the damping characteristics of a practical power system network representing part of the Electricity Generating Authority of Thailand (EGAT) system. The proposed PSS consists of a neuro-identifier and a neuro-controller, both developed on the functional link network (FLN) model. A recursive on-line training algorithm has been utilized to train the two neural networks. Simulation results obtained under various operating conditions and severe disturbance cases show that the proposed neuro-PSS provides better damping of the local as well as inter-area modes of oscillation than a conventional PSS.
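
A compact sketch of an FLN of the kind described, assuming a trigonometric functional expansion and a simple least-mean-squares (LMS) online update standing in for the paper's recursive training algorithm; the expansion order and learning rate are arbitrary.

```python
import numpy as np

def functional_expansion(x, order=2):
    """Expand each input with sin/cos harmonics, as in FLN models."""
    terms = [x]
    for k in range(1, order + 1):
        terms += [np.sin(k * np.pi * x), np.cos(k * np.pi * x)]
    return np.concatenate(terms)

class FLN:
    def __init__(self, n_inputs, order=2, lr=0.05):
        self.order, self.lr = order, lr
        self.w = np.zeros(n_inputs * (2 * order + 1))   # single linear layer

    def predict(self, x):
        return self.w @ functional_expansion(x, self.order)

    def train_step(self, x, target):
        """One LMS update: move the weights against the prediction error."""
        phi = functional_expansion(x, self.order)
        error = target - self.w @ phi
        self.w += self.lr * error * phi
        return error
```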

Relevance:

40.00%

Publisher:

Abstract:

Carbon nanotubes dispersed in a polymer matrix have been aligned in the form of fibers and interconnects and cured electrically and by UV light. The conductivity and effective semiconductor tunneling under reverse-to-forward bias fields have been designed so that each fiber/channel has a distinguishable current-voltage response. The current-voltage response is a function of the strain applied to the fibers along the axial direction, and biaxial and shear strains are correlated by differentiating the signals from the aligned fibers/channels. Using a small doping of magnetic nanoparticles in these composite fibers, magneto-resistance properties are realized that are strong enough for the resulting magnetostriction to be used as a state variable for signal processing and computing. Various basic analog signal-processing tasks, such as addition, convolution, and filtering, can be performed. This preliminary study shows promising applications of the concept in combined analog-digital computation in carbon-nanotube-based fibers. Various dynamic effects on the output signals, such as relaxation, electric-field-dependent nonlinearities, and hysteresis, are studied using experimental data and an analytical model.

Relevance:

40.00%

Publisher:

Abstract:

This paper develops a GIS (geographical information system)-based data-mining approach for optimally selecting locations and determining installed capacities for distributed biomass power generation systems, in the context of decentralized energy planning for rural regions. The optimal locations within a cluster of villages are obtained by matching the installed capacity with the demand for power, minimizing the cost of transporting biomass from dispersed sources to the power generation system, and minimizing the cost of distributing electricity from the power generation system to the demand centers or villages. The methodology was validated by using it to develop an optimal plan for implementing distributed biomass-based power systems to meet the rural electricity needs of Tumkur district in India, consisting of 2700 villages. The approach uses a k-medoid clustering algorithm to divide the total region into clusters of villages and locate biomass power generation systems at the medoids. The optimal value of k is determined iteratively by running the algorithm over the entire search space for different values of k, subject to demand-supply matching constraints, and is chosen to minimize the total cost of system installation, biomass transportation, and transmission and distribution. A smaller region consisting of 293 villages was selected to study the sensitivity of the results to varying demand and supply parameters. The results of the clustering are presented on a GIS map of the region.
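
A minimal k-medoid sketch under simplifying assumptions: villages are 2-D points and the cost is plain pairwise distance, standing in for the paper's fuller objective of installation, biomass transport, and transmission/distribution costs with demand-supply matching.

```python
import numpy as np

def k_medoids(points, k, iters=50, seed=0):
    """Plain k-medoid clustering: medoids are actual villages (candidate plant sites)."""
    rng = np.random.default_rng(seed)
    dist = np.linalg.norm(points[:, None] - points[None, :], axis=2)
    medoids = rng.choice(len(points), k, replace=False)
    for _ in range(iters):
        labels = dist[:, medoids].argmin(axis=1)       # assign each village to nearest plant
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.flatnonzero(labels == j)
            if members.size:                           # medoid = member minimizing in-cluster cost
                new_medoids[j] = members[dist[np.ix_(members, members)].sum(axis=1).argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, labels

villages = np.random.default_rng(1).uniform(0, 50, size=(293, 2))  # toy coordinates (km)
# In the paper's procedure, k itself is then swept over a range and the value
# minimizing total installation + transport + distribution cost is retained.
medoids, labels = k_medoids(villages, k=8)
```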

Relevance:

40.00%

Publisher:

Abstract:

Chebyshev-inequality-based convex relaxations of Chance-Constrained Programs (CCPs) are shown to be useful for learning classifiers on massive datasets. In particular, an algorithm is proposed that integrates efficient clustering procedures with CCP approaches for computing classifiers on large datasets. The key idea is to identify high-density regions, or clusters, from the individual class-conditional densities and then use a CCP formulation to learn a classifier on the clusters. The CCP formulation ensures that most of the data points in a cluster are correctly classified by employing a Chebyshev-inequality-based convex relaxation. This relaxation depends heavily on second-order statistics, and such relaxations are in general susceptible to moment estimation errors. One contribution of the paper is to propose several formulations that are robust to such errors; in particular, a generic way of making these formulations robust to moment estimation errors is illustrated using two novel confidence sets. An important contribution is to show that when either of the confidence sets is employed, for the special case of spherical normal cluster distributions, the robust variant of the formulation can be posed as a second-order cone program. Empirical results show that the robust formulations achieve accuracies comparable to those obtained with true moments, even when the moment estimates are erroneous. The results also illustrate the benefits of employing the proposed methodology for robust classification of large-scale datasets.
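
For one pair of clusters, the Chebyshev-based relaxation can be written directly as a second-order cone constraint. The sketch below, using the cvxpy modeling library (an assumption; the paper does not prescribe a solver), encodes the chance constraint for each cluster via the multivariate Chebyshev bound w.mu + b >= kappa * ||Sigma^{1/2} w|| with kappa = sqrt(eta / (1 - eta)); the cluster statistics are toy values.

```python
import numpy as np
import cvxpy as cp

eta = 0.9                                  # required in-cluster classification confidence
kappa = np.sqrt(eta / (1 - eta))           # multivariate Chebyshev factor

mu_pos, mu_neg = np.array([2.0, 2.0]), np.array([-2.0, -1.0])   # cluster means (toy)
S_pos = S_neg = np.eye(2)                  # Sigma^{1/2} per cluster (toy)

w, b = cp.Variable(2), cp.Variable()
constraints = [
    w @ mu_pos + b - 1 >= kappa * cp.norm(S_pos @ w),     # positive cluster mostly on +1 side
    -(w @ mu_neg + b) - 1 >= kappa * cp.norm(S_neg @ w),  # negative cluster mostly on -1 side
]
cp.Problem(cp.Minimize(cp.norm(w)), constraints).solve()  # max-margin-style objective
print(w.value, b.value)
```

The robust variants in the paper effectively replace the point estimates (mu, Sigma^{1/2}) above with confidence sets around them, which tightens the cone constraints.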

Relevance:

40.00%

Publisher:

Abstract:

Functions are important in designing. However, several issues hinder progress with the understanding and usage of functions: the lack of a clear and overarching definition of function, the lack of overall justification for the inevitability of the multiple views of function, and the scarcity of systematic attempts to relate these views to one another. To help resolve these, the objectives of this research are to propose a common definition of function that underlies the multiple views in the literature, and to identify and validate the views of function that are logically justified to be present in designing. Function is defined as a change intended by designers between two scenarios: before and after the introduction of the design. A framework is proposed that comprises the above definition of function and an empirically validated model of designing: the extended generate, evaluate, modify, and select model of state change, together with the action, part, phenomenon, input, organ, and effect model of causality (known together as GEMS of SAPPhIRE), comprising the views of activity, outcome, requirement-solution-information, and system-environment. The framework is used to identify the logically possible views of function in the context of designing and is validated by comparing these with the views of function in the literature. Describing the different views of function using the proposed framework should enable comparisons and help determine relationships among the various views, leading to better understanding and usage of functions in designing.

Relevance:

40.00%

Publisher:

Abstract:

Background: Six new cationic gemini lipids based on cholesterol, possessing different positional combinations of hydroxyethyl (-CH2CH2OH) and oligo-oxyethylene -(CH2CH2O)(n)- moieties, were synthesized. For comparison, the corresponding monomeric lipid was also prepared. Each new cationic lipid was found to form stable, clear suspensions in aqueous media. Methodology/Principal Findings: To understand the nature of the individual lipid aggregates, we studied their aggregation properties using transmission electron microscopy (TEM), dynamic light scattering (DLS), zeta potential measurements, and X-ray diffraction (XRD). We also studied lipid/DNA complex (lipoplex) formation and the release of DNA from such lipoplexes using ethidium bromide. In the presence of the helper lipid 1,2-dioleoyl phosphatidylethanolamine (DOPE), these gemini lipids showed significant enhancements in gene transfection compared with several commercially available transfection agents. The cholesterol-based gemini bearing -CH2CH2OH groups at the head and one oxyethylene spacer was found to be the most effective lipid; it showed transfection activity, even in the presence of high serum levels (50%), greater than that of Effectene, one of the most potent commercially available transfection agents. Most of these geminis protected plasmid DNA remarkably well against DNase I in serum, although the degree of stability was found to vary with their structural features. Conclusions/Significance: The -OH groups present on the cationic headgroups, in combination with oxyethylene linkers on cholesterol-based geminis, gave an optimized combination: a new genus of gemini lipids possessing high transfection efficiency even in the presence of a very high percentage of serum. This property makes them preferred transfection reagents for possible in vivo studies.