13 results for Top-down control

at Indian Institute of Science - Bangalore - India


Relevance:

100.00%

Publisher:

Abstract:

Database schemes can be viewed as hypergraphs, with individual relation schemes corresponding to the edges of a hypergraph. Under this setting, a new class of "acyclic" database schemes was recently introduced and shown to possess a number of desirable properties. However, unlike the case of ordinary undirected graphs, there are several inequivalent notions of acyclicity for hypergraphs. Of special interest among these are the alpha-, beta-, and gamma-degrees of acyclicity, each characterizing an equivalence class of desirable properties for database schemes represented as hypergraphs. In this paper, two complementary approaches to designing beta-acyclic database schemes are presented. In the first part, a new notion called the "independent cycle" is introduced. Based on this, a criterion for beta-acyclicity is developed and shown to be equivalent to the existing definitions of beta-acyclicity. From this criterion and the concept of the dual of a hypergraph, an efficient algorithm for testing beta-acyclicity is developed. In the second part, a procedure for top-down generation of beta-acyclic schemes is developed and its correctness is established. Finally, extensions and applications of these ideas are described.
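The abstract does not spell out the beta-acyclicity test itself, but the flavour of such hypergraph-reduction algorithms can be illustrated with the classical GYO reduction, which decides the weaker alpha-acyclicity. A minimal sketch, with function names our own:

```python
from collections import Counter

def is_alpha_acyclic(edges):
    """GYO reduction: repeatedly delete vertices occurring in exactly one
    edge ("ears") and edges contained in another edge.  The hypergraph is
    alpha-acyclic iff the reduction eliminates every edge."""
    edges = [set(e) for e in edges]
    changed = True
    while changed:
        changed = False
        counts = Counter(v for e in edges for v in e)
        for e in edges:                        # delete lone vertices
            lone = {v for v in e if counts[v] == 1}
            if lone:
                e -= lone
                changed = True
        kept = []
        for i, e in enumerate(edges):          # delete empty/contained edges
            if not e or any(j != i and (e < f or (e == f and j < i))
                            for j, f in enumerate(edges)):
                changed = True
            else:
                kept.append(e)
        edges = kept
    return not edges

# A "path" scheme reduces away completely; a triangle of binary edges does not.
print(is_alpha_acyclic([{"A", "B"}, {"B", "C"}, {"C", "D"}]))   # True
print(is_alpha_acyclic([{"A", "B"}, {"B", "C"}, {"C", "A"}]))   # False
```

Since every beta-acyclic scheme is also alpha-acyclic, this test is a necessary (though not sufficient) check for the schemes the paper constructs.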

Relevance:

90.00%

Publisher:

Abstract:

Among all methods of metal alloy slurry preparation, the cooling slope method is the simplest in terms of design and process control. The method involves pouring the melt from the top down an oblique, channel-shaped plate cooled from below by counter-flowing water. The melt, while flowing down, partially solidifies and forms columnar dendrites on the plate wall. These dendrites are broken into equiaxed grains and washed away with the melt. The melt, together with the equiaxed grains, forms a semisolid slurry that is collected at the slope exit and cast into billets having a non-dendritic microstructure. The final microstructure depends on several process parameters, such as slope angle, slope length, pouring superheat, and cooling rate. The present work involves a scaling analysis of the conservation equations of momentum, energy, and species for melt flow down a cooling slope. The main purpose of the scaling analysis is to obtain physical insight into the role and relative importance of each parameter in influencing the final microstructure. To assess the scaling analysis, the trends it predicts are compared against corresponding numerical results from an enthalpy-based solidification model incorporating solid-phase movement.

Relevance:

90.00%

Publisher:

Abstract:

The use of non-woody fuels in gasification systems has generally been limited to rice husk; other residues are rarely tried. Given concerns over the sustained availability of woody biomass in countries like India, alternative fuels are being explored for a sustainable fuel supply. Agro-residues have been explored after briquetting, but a few feedstocks, such as coconut fronds and maize cobs, may require fewer preprocessing steps than briquetting. This paper presents a detailed investigation into using coconut fronds as a fuel in an open-top downdraft gasification system. The fuel has an ash content of 7% and was dried to a moisture level of 12%. The average bulk density was found to be 230 kg/m3 at an average fuel particle size of 40 mm, compared to 350 kg/m3 for standard wood pieces. A typical dry coconut frond weighs about 2.5 kg and is on average 6 m long; about 90% of the frond is the petiole, which is the part generally used as fuel. The focus was also to compare the overall process with that for a typical woody biomass, such as subabul, whose ash content is 1%. The open-top gasification system consists of a reactor and a cooling and cleaning system along with water treatment. The performance parameters studied were gas composition, tar and particulates in the clean gas, water quality, and reactor pressure drop, apart from standard data collection such as fuel flow rate. The average gas composition with coconut fronds was found to be CO 15 +/- 1.0%, H-2 16 +/- 1%, CH4 0.5 +/- 0.1%, CO2 12.0 +/- 1.0%, and the rest N2, compared to CO 19 +/- 1.0%, H-2 17 +/- 1.0%, CH4 1 +/- 0.2%, CO2 12 +/- 1.0%, and the rest N2 for wood. The tar and particulate content in the clean gas was found to be about 10 and 12 mg/m3, respectively, in both cases. The high ash content of coconut fronds increased the reactor pressure drop compared to woody biomass.
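As a rough check on what such gas compositions imply, the lower heating value of the producer gas can be estimated from the volume fractions of its combustible species. The sketch below uses nominal per-species heating values (typical handbook figures, not values from the paper):

```python
# Nominal lower heating values of the combustible species, MJ/Nm3.
# These are typical handbook values, not figures reported in the paper.
LHV = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}

def gas_lhv(fractions):
    """Lower heating value of producer gas, MJ/Nm3, from the volume
    fractions (in %) of its combustible components."""
    return sum(LHV[s] * fractions.get(s, 0.0) / 100.0 for s in LHV)

coconut = {"CO": 15.0, "H2": 16.0, "CH4": 0.5}   # mean values for coconut fronds
wood    = {"CO": 19.0, "H2": 17.0, "CH4": 1.0}   # mean values for subabul wood
print(round(gas_lhv(coconut), 2), round(gas_lhv(wood), 2))   # → 3.8 4.59
```

Both estimates fall in the typical range for air-blown biomass producer gas, with the coconut-frond gas somewhat leaner, consistent with its lower CO and CH4 fractions.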

Relevance:

80.00%

Publisher:

Abstract:

A computational algorithm (based on Smullyan's analytic tableau method) that verifies whether a given well-formed formula in propositional calculus is a tautology has been implemented on a DEC System 10. The stepwise refinement approach to program development used for this implementation forms the subject matter of this paper. The top-down design has resulted in a modular and reliable program package. This computational algorithm compares favourably with the algorithm based on the well-known resolution principle used in theorem provers.
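The analytic tableau method itself is compact enough to sketch: to show that F is a tautology, build the tableau for not-F and check that every branch closes on a complementary pair of literals. A minimal sketch, with the tuple encoding and names our own rather than the paper's:

```python
def closes(formulas, lits=frozenset()):
    """True iff every tableau branch for this formula set contains a clash."""
    if not formulas:
        return False                            # open branch: no contradiction
    f, rest = formulas[0], formulas[1:]
    if isinstance(f, str):                      # atom p
        return ('-' + f) in lits or closes(rest, lits | {f})
    op = f[0]
    if op == 'not':
        g = f[1]
        if isinstance(g, str):                  # literal not-p
            return g in lits or closes(rest, lits | {'-' + g})
        if g[0] == 'not':                       # not not A  ->  A
            return closes(rest + (g[1],), lits)
        if g[0] == 'and':                       # not(A and B): branch on not-A | not-B
            return (closes(rest + (('not', g[1]),), lits) and
                    closes(rest + (('not', g[2]),), lits))
        if g[0] == 'or':                        # not(A or B) -> not-A, not-B
            return closes(rest + (('not', g[1]), ('not', g[2])), lits)
        if g[0] == 'imp':                       # not(A -> B) -> A, not-B
            return closes(rest + (g[1], ('not', g[2])), lits)
    if op == 'and':                             # A and B -> A, B
        return closes(rest + (f[1], f[2]), lits)
    if op == 'or':                              # A or B: branch on A | B
        return (closes(rest + (f[1],), lits) and
                closes(rest + (f[2],), lits))
    if op == 'imp':                             # A -> B: branch on not-A | B
        return (closes(rest + (('not', f[1]),), lits) and
                closes(rest + (f[2],), lits))

def is_tautology(f):
    """F is a tautology iff the tableau for not-F closes on every branch."""
    return closes((('not', f),))

print(is_tautology(('or', 'p', ('not', 'p'))))   # True: excluded middle
print(is_tautology(('imp', 'p', 'q')))           # False: open branch survives
```

Each non-branching rule extends the current branch, while 'or'-type rules split it; the recursion mirrors the tableau tree directly.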

Relevance:

80.00%

Publisher:

Abstract:

The method of structured programming, or program development using a top-down, stepwise refinement technique, provides a systematic approach to the development of programs of considerable complexity. The aim of this paper is to present the philosophy of structured programming through a case study of a nonnumeric programming task. The problem of converting a well-formed formula in first-order logic into prenex normal form is considered. The program has been coded in the programming language PASCAL and implemented on a DEC-10 system. It has about 500 lines of code and comprises 11 procedures.
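The core quantifier-pulling step of the conversion can be sketched as follows, assuming the usual preprocessing has already been done (implications eliminated, negations pushed to atoms, and bound variables renamed apart). The tuple encoding and names are ours, not the paper's PASCAL code:

```python
def prenex(f):
    """Collect quantifiers of f into a prefix list and return (prefix, matrix).
    Assumes implications are eliminated, negations pushed to atoms, and
    bound variables renamed apart -- the standard preprocessing steps."""
    if isinstance(f, str):                       # atom, e.g. 'P(x)'
        return [], f
    op = f[0]
    if op in ('forall', 'exists'):               # hoist the quantifier
        qs, body = prenex(f[2])
        return [(op, f[1])] + qs, body
    if op in ('and', 'or'):                      # combine the two prefixes
        qa, a = prenex(f[1])
        qb, b = prenex(f[2])
        return qa + qb, (op, a, b)
    return [], f                                 # negated atom

def to_prenex(f):
    """Rebuild the formula with all quantifiers at the front."""
    qs, body = prenex(f)
    for op, var in reversed(qs):
        body = (op, var, body)
    return body

f = ('and', ('forall', 'x', 'P(x)'), ('exists', 'y', 'Q(y)'))
print(to_prenex(f))   # → ('forall', 'x', ('exists', 'y', ('and', 'P(x)', 'Q(y)')))
```

Because the variables are assumed renamed apart, the prefixes can simply be concatenated without capture; a full converter would perform that renaming first.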

Relevance:

80.00%

Publisher:

Abstract:

The worldwide research in nanoelectronics is motivated by the fact that the scaling of MOSFETs by the conventional top-down approach will not continue forever, due to fundamental limits imposed by physics, even if it can be delayed for some more years. The research community in this domain has become largely multidisciplinary, trying to discover novel transistor structures built with novel materials so that the semiconductor industry can continue to follow its projected roadmap. However, setting up and running a nanoelectronics research facility is hugely expensive. A common model is therefore to set up a central networked facility that can be shared by a large number of users across the research community. The Centres for Excellence in Nanoelectronics (CEN) at the Indian Institute of Science, Bangalore (IISc) and the Indian Institute of Technology, Bombay (IITB) are such central networked facilities, set up in 2005 with funding of about USD 20 million from the Department of Information Technology (DIT), Ministry of Communications and Information Technology (MCIT), Government of India. The Indian Nanoelectronics Users Program (INUP) is a missionary program intended not only to spread awareness and provide training in nanoelectronics but also to give the wider nanoelectronics research community in India easy access to the latest facilities at the CENs at IISc and IITB. This program, also funded by MCIT, aims to train researchers by conducting workshops and hands-on training programs and by providing access to the CEN facilities. It is a unique program aiming to expedite nanoelectronics research in the country: funding for projects proposed by researchers from around India has prior financial approval from the government and requires only technical approval by the IISc/IITB team. 
This paper discusses the objectives of INUP, gives brief descriptions of the CEN facilities and the training programs conducted by INUP, and lists various research activities currently under way in the program.

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we present a new algorithm for learning oblique decision trees. Most current decision tree algorithms rely on impurity measures to assess the goodness of hyperplanes at each node while learning the tree in a top-down fashion. These impurity measures do not properly capture the geometric structure in the data. Motivated by this, our algorithm assesses hyperplanes in a way that takes the geometric structure of the data into account. At each node of the decision tree, we find the clustering hyperplanes for both classes and use their angle bisectors as the split rule at that node. We show through empirical studies that this idea leads to small decision trees and better performance. We also present analysis showing that the angle bisectors of the clustering hyperplanes used as split rules at each node are solutions of an interesting optimization problem, and hence argue that this is a principled method of learning a decision tree.
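The split rule can be written down directly: points equidistant from two hyperplanes w1.x + b1 = 0 and w2.x + b2 = 0 lie on one of their two angle bisectors, obtained from the unit-normalized hyperplane equations. A minimal sketch (function and variable names are ours):

```python
import numpy as np

def angle_bisectors(w1, b1, w2, b2):
    """Return the two angle bisectors of the hyperplanes w1.x + b1 = 0 and
    w2.x + b2 = 0.  A point on a bisector is equidistant from both:
    (w1.x + b1)/||w1|| = +/- (w2.x + b2)/||w2||."""
    n1, n2 = np.linalg.norm(w1), np.linalg.norm(w2)
    u1, c1 = np.asarray(w1, float) / n1, b1 / n1
    u2, c2 = np.asarray(w2, float) / n2, b2 / n2
    return (u1 - u2, c1 - c2), (u1 + u2, c1 + c2)

# The bisectors of the two coordinate axes through the origin are the
# diagonals x - y = 0 and x + y = 0.
(wa, ba), (wb, bb) = angle_bisectors([1, 0], 0.0, [0, 1], 0.0)
print(wa, ba)   # → [ 1. -1.] 0.0
print(wb, bb)   # → [1. 1.] 0.0
```

In the paper's algorithm, the clustering hyperplanes are fit to the two classes at a node and one of these two bisectors is then selected as the split; only the bisector computation is shown here.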

Relevance:

80.00%

Publisher:

Abstract:

The effect of attention on firing rates varies considerably within a single cortical area. The firing rates of some neurons are greatly modulated by attention, while those of others are hardly affected. The reason for this variability across neurons is unknown. We found that the variability in attention modulation across neurons in area MT of macaques can be well explained by variability in the strength of tuned normalization across neurons. The presence of tuned normalization also explains a striking asymmetry in attention effects within neurons: when two stimuli are in a neuron's receptive field, directing attention to the preferred stimulus modulates firing rates more than directing attention to the nonpreferred stimulus. These findings show that much of the neuron-to-neuron variability in the modulation of responses by attention depends on variability in the way neurons process multiple stimuli, rather than on differences in the influence of top-down signals related to attention.
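The role of tuned normalization can be illustrated with a simple divisive-normalization model of attention. The functional form and parameter names below are an illustrative sketch, not the paper's fitted model: alpha sets how strongly the nonpreferred stimulus drives the normalization pool, and attention multiplicatively scales each stimulus's drive.

```python
def response(L_pref, L_null, a_pref=1.0, a_null=1.0, alpha=1.0, sigma=0.1):
    """Illustrative divisive-normalization response to a preferred and a
    nonpreferred stimulus in the receptive field.
    L_pref, L_null : excitatory drives of the two stimuli
    a_pref, a_null : attentional gain applied to each stimulus
    alpha          : tuned-normalization strength of the nonpreferred stimulus
    sigma          : semi-saturation constant"""
    drive = a_pref * L_pref + a_null * alpha * L_null
    pool  = a_pref + a_null * alpha + sigma
    return drive / pool

base        = response(50, 10)               # attention directed elsewhere
attend_pref = response(50, 10, a_pref=2.0)   # attend the preferred stimulus
attend_null = response(50, 10, a_null=2.0)   # attend the nonpreferred stimulus
```

With these (arbitrary) numbers, attending the preferred stimulus raises the response above baseline while attending the nonpreferred stimulus pulls it below, and the size of both effects grows with alpha, matching the abstract's point that normalization strength, not attentional input per se, sets the modulation.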

Relevance:

80.00%

Publisher:

Abstract:

The solvated metal atom dispersion (SMAD) method has been used for the synthesis of colloids of metal nanoparticles. It is a top-down approach involving condensation of metal atoms in low-temperature solvent matrices in a SMAD reactor maintained at 77 K. Warming the matrix results in a slurry of metal atoms that interact with one another to form particles, which then grow in size. The organic solvent solvates the particles and acts as a weak capping agent to halt or slow the growth process to a certain extent. The as-prepared colloid consists of metal nanoparticles that are quite polydisperse. In a process termed digestive ripening, addition of a capping agent to the polydisperse as-prepared colloid renders it highly monodisperse under either ambient or thermal conditions. In this as-yet poorly understood process, smaller particles grow and larger ones diminish in size until the system attains uniformity in size and a dynamic equilibrium is established. Using the SMAD method in combination with digestive ripening, highly monodisperse metal, core-shell, alloy, and composite nanoparticles have been synthesized. This article reviews our contributions, together with some literature reports, on this methodology for realizing various nanostructured materials.

Relevance:

80.00%

Publisher:

Abstract:

Synfire waves are propagating spike packets in synfire chains, which are feedforward chains embedded in random networks. Although synfire waves have proved to be an effective quantification of network activity with clear relations to network structure, their utility is largely limited to feedforward networks with low background activity. To overcome these shortcomings, we describe a novel generalisation of synfire waves and define a `synconset wave' as a cascade of first spikes within a synchronisation event. Synconset waves occur in `synconset chains', which are feedforward chains embedded in possibly heavily recurrent networks with heavy background activity. We probed the utility of synconset waves using simulations of single-compartment neuron network models with biophysically realistic conductances, and demonstrated that the spread of synconset waves directly follows from the network connectivity matrix and is modulated by top-down inputs and the resultant oscillations. Such synconset profiles lend intuitive insights into network organisation in terms of connection probabilities between various network regions rather than an adjacency matrix. To test this intuition, we developed a Bayesian likelihood function that quantifies the probability that an observed synfire wave was caused by a given network. Further, we demonstrate its utility in the inverse problem of identifying the network that caused a given synfire wave. This method was effective even in highly subsampled networks where only a small subset of neurons was accessible, showing its utility in the experimental estimation of connectomes in real neuronal networks. Together, we propose synconset chains/waves as an effective framework for understanding the impact of network structure on function, and as a step towards developing physiology-driven network identification methods. 
Finally, as synconset chains extend the utility of synfire chains to arbitrary networks, we suggest applications of our framework to several aspects of network physiology, including cell assemblies, population codes, and oscillatory synchrony.

Relevance:

80.00%

Publisher:

Abstract:

Two atmospheric inversions (one finely resolved and one process-discriminating) and a process-based model of land surface exchanges are brought together to analyse the variations of methane emissions from 1990 to 2009. A focus is placed on the role of natural wetlands and on the years 2000-2006, a period of stable atmospheric concentrations. From 1990 to 2000, the top-down and bottom-up views agree on the time-phasing of global total and wetland emission anomalies. The process-discriminating inversion indicates that wetlands dominate the time-variability of methane emissions (90% of the total variability). The contribution of tropical wetlands to the anomalies is found to be large, especially during the post-Pinatubo years (global negative anomalies with minima between -41 and -19 Tg yr(-1) in 1992) and during the alternating 1997-1998 El Nino/1998-1999 La Nina events (maximal anomalies in tropical regions between +16 and +22 Tg yr(-1) for the inversions, and anomalies due to tropical wetlands between +12 and +17 Tg yr(-1) for the process-based model). Between 2000 and 2006, during the stagnation of methane concentrations in the atmosphere, the top-down and bottom-up approaches agree that South America is the main region contributing to anomalies in natural wetland emissions, but they disagree on the sign and magnitude of the flux trend in the Amazon basin. A negative trend (-3.9 +/- 1.3 Tg yr(-1)) is inferred by the process-discriminating inversion, whereas a positive trend (+1.3 +/- 0.3 Tg yr(-1)) is found by the process model. Although process-based models have their own caveats and may not account for all processes, the positive trend found by the bottom-up approach is considered more likely because it is a robust feature of the process-based model, consistent with analysed precipitation and the satellite-derived extent of inundated areas. In contrast, the surface-data-based inversions lack constraints for South America. 
This result suggests the need for a re-interpretation of the large increase found in anthropogenic methane inventories after 2000.

Relevance:

40.00%

Publisher:

Abstract:

Airlines have successfully practiced revenue management over the past four decades and enhanced their revenue. Most traditional models assume that customers buying a high-fare class ticket will not purchase a low-fare class ticket even if it is available. This is not a very realistic assumption and has led to revenue leakage due to customers exhibiting buy-down behaviour. This paper aims at devising a suitable incentive mechanism that would induce the customer to reveal his nature, which helps in reducing revenue leakage. We show that the proposed incentive mechanism is profitable to both the buyer and the seller and hence ensures the buyer's participation in the mechanism. Journal of the Operational Research Society (2011) 62, 1566-1573. doi:10.1057/jors.2010.57. Published online 11 August 2010.

Relevance:

40.00%

Publisher:

Abstract:

Two-axis micromanipulators, whose tip orientation and position can be controlled in real time in the scanning plane, enable versatile probing systems for 2.5-D nanometrology. The key to achieving high-precision probing is to accurately control the interaction point of the manipulator tip when its orientation is changed. This paper presents the development of a probing system in which the deviation of the end point due to large orientation changes is controlled to within 10 nm. To achieve this, a novel micromanipulator design is first proposed, wherein the end point of the tip is located on the axis of rotation. Next, the residual tip motion caused by fabrication error and actuation crosstalk is modeled, and a systematic method to compensate for it is presented. The manipulator is fabricated, and the performance of the developed scheme for controlling tip position during orientation changes is experimentally validated. Subsequently, the two-axis probing system is demonstrated by scanning the full top surface of a micropipette down to a diameter of 300 nm.