121 results for Computational experiment
Abstract:
Computations have been carried out to simulate supersonic flow through a set of converging-diverging nozzles, with their expanding jets forming a laser cavity, and the subsequent flow through the diffusers downstream of the cavity. A thorough numerical investigation with a 3-D RANS code is carried out to capture the flow field, which comprises shock patterns and multiple interacting supersonic jets. The pressure recovery characteristics of the flow through the diffusers are an important output of the simulation and are critical to the performance of the laser device. The computed results show close agreement with the experimentally measured parameters as well as with other established results, indicating that the flow analysis is satisfactory.
Abstract:
The work reported here is concerned with a detailed thermochemical evaluation of the flaming-mode behaviour of a gasifier-based stove. Determination of the gas composition over the fuel bed, and of the surface and gas temperatures in the gasification process, constitutes the principal experimental feature. A simple atomic balance for the gasification reaction, combined with the gas composition from the experiments, is used to determine the CH4 equivalent of the higher hydrocarbons and the gasification efficiency (eta_g). The components of utilization efficiency, namely gasification-combustion and heat transfer, are explored. Reactive flow computational studies using the measured gas composition over the fuel bed are used to simulate the thermochemical flow field and the heat transfer to the vessel; hitherto ignored vessel-size effects in the extraction of heat from the stove are established clearly. The overall flaming-mode efficiency of the stove is 50-54%; the convective and radiative components of heat transfer are established to be 45-47% and 5-7%, respectively. The efficiency estimates from reacting computational fluid dynamics (RCFD) compare well with experiments. (C) 2011 Elsevier Ltd. All rights reserved.
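As a rough, hedged illustration of the kind of energy bookkeeping such an evaluation involves, the Python sketch below estimates a gasification efficiency from a dry producer-gas composition using a simple cold-gas energy ratio; the heating values, gas yield, fuel heating value and composition are all illustrative placeholders and stand in for, rather than reproduce, the paper's atomic-balance procedure.

# Minimal sketch, assuming a cold-gas energy-ratio definition of eta_g and
# textbook lower heating values; none of these numbers come from the paper.
LHV_MJ_PER_NM3 = {"CO": 12.6, "H2": 10.8, "CH4": 35.8}

def gasification_efficiency(gas_fractions, gas_yield_nm3_per_kg, fuel_lhv_mj_per_kg):
    """Energy carried by the producer gas divided by energy in the solid fuel."""
    gas_lhv = sum(LHV_MJ_PER_NM3[s] * x
                  for s, x in gas_fractions.items() if s in LHV_MJ_PER_NM3)
    return gas_lhv * gas_yield_nm3_per_kg / fuel_lhv_mj_per_kg

# Hypothetical producer-gas composition (mole fractions) and operating values
eta_g = gasification_efficiency(
    {"CO": 0.18, "H2": 0.15, "CH4": 0.02, "CO2": 0.12, "N2": 0.53},
    gas_yield_nm3_per_kg=2.4, fuel_lhv_mj_per_kg=18.0)
print(f"eta_g ~ {eta_g:.2f}")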
Abstract:
Saplings of forty-nine species of trees from Western Ghats forests were planted on a 1.5-hectare tract of the Deccan plateau (in the campus of the Indian Institute of Science, Bangalore) and their performance was monitored for 23 years. The objective was to evaluate their adaptability to a habitat and conditions apparently alien to these species. The study was also meant to understand the linkages of these trees with the surrounding environment. Contrary to the belief that tree species are very sensitive to a change of location and conditions, the introduced trees have grown as well as they would in their native habitat and have maintained their phenology. Further, they have grown in perfect harmony with trees native to the location. The results show that the introduced species are opportunistic; they acclimatized readily and grew well, overcoming the need for the edaphic and other factors that are believed to be responsible for their endemicity. Besides ex situ conservation, the creation of the mini-forest has other accrued ecosystem benefits. For instance, the ground-water level has risen and the ambient temperature has come down by two degrees.
Abstract:
A new structured discretization of 2D space, named X-discretization, is proposed to solve bivariate population balance equations using the framework of minimal internal consistency of discretization of Chakraborty and Kumar [2007, A new framework for solution of multidimensional population balance equations. Chem. Eng. Sci. 62, 4112-4125] for breakup and aggregation of particles. The 2D space of particle constituents (internal attributes) is discretized into bins by using arbitrarily spaced constant-composition radial lines and constant-mass lines of slope -1. The quadrilaterals are triangulated by using straight lines pointing towards the mean composition line. The monotonicity of the new discretization makes it quite easy to implement, like a rectangular grid, but with significantly reduced numerical dispersion. We use the new discretization of space to automate the expansion and contraction of the computational domain for the aggregation process, corresponding to the formation of larger particles and the disappearance of smaller particles, by adding and removing constant-mass lines at the boundaries. The results show that the predictions of particle size distribution on a fixed X-grid are in better agreement with the analytical solution than those obtained with the earlier techniques. The simulations carried out with expansion and/or contraction of the computational domain as the population evolves show that the proposed strategy of evolving the computational domain with the aggregation process brings down the computational effort quite substantially; the larger the extent of evolution, the greater the reduction in computational effort. (C) 2011 Elsevier Ltd. All rights reserved.
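A minimal sketch of the grid-construction idea described above, under assumed spacings: nodes are placed at the intersections of constant-composition radial lines with constant-mass lines x1 + x2 = m of slope -1, and domain expansion or contraction amounts to adding or removing constant-mass lines at the boundaries. The spacings and counts are illustrative, not the paper's actual discretization.

import numpy as np

# Minimal sketch of an X-type grid in the (x1, x2) constituent plane:
# intersections of constant-composition radial lines with constant-mass
# lines x1 + x2 = m (slope -1). Spacings below are illustrative placeholders.
compositions = np.linspace(0.1, 0.9, 9)      # x1 / (x1 + x2) on each radial line
masses = np.geomspace(1.0, 64.0, 13)         # total mass m on each slope -1 line

nodes = np.array([[c * m, (1.0 - c) * m] for m in masses for c in compositions])

# Expanding the domain as aggregation forms larger particles corresponds to
# appending further constant-mass lines at the upper boundary; contraction
# removes lines at the lower boundary as the smallest particles disappear.
masses = np.append(masses, 2.0 * masses[-1])
masses = masses[1:]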
Abstract:
Current scientific research is characterized by increasing specialization, accumulating knowledge at a high speed due to parallel advances in a multitude of sub-disciplines. Recent estimates suggest that human knowledge doubles every two to three years, and with the advances in information and communication technologies, this wide body of scientific knowledge is available to anyone, anywhere, anytime. This may also be referred to as ambient intelligence: an environment characterized by plentiful and available knowledge. The bottleneck in utilizing this knowledge for specific applications is not accessing but assimilating the information and transforming it to suit the needs of a specific application. The increasingly specialized areas of scientific research often share the common goal of converting data into insight, allowing the identification of solutions to scientific problems. Because of this common goal, there are strong parallels between different areas of application that can be exploited and used to cross-fertilize different disciplines. For example, the same fundamental statistical methods are used extensively in speech and language processing, in materials science applications, in visual processing and in biomedicine. Each sub-discipline has developed its own specialized methodologies that make these statistical methods successful for the given application. The unification of specialized areas is possible because many different problems share strong analogies, making the theories developed for one problem applicable to other areas of research. It is the goal of this paper to demonstrate the utility of merging two disparate areas of application to advance scientific research. The merging process requires cross-disciplinary collaboration to allow maximal exploitation of the advances in one sub-discipline for the benefit of another. We demonstrate this general concept with the specific example of merging language technologies and computational biology.
Abstract:
In this paper we develop methods to compute maps from differential equations. We take two examples: the harmonic oscillator and Duffing's equation. First we convert these equations to a canonical form, which is slightly nontrivial for Duffing's equation. Then we show a method to extend these differential equations; in the second case, symbolic algebra needs to be used. Once the extensions are accomplished, various maps are generated. Poincaré sections are seen as a special case of such generated maps. Other applications are also discussed.
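For concreteness, the sketch below generates a Poincaré section for a forced Duffing oscillator by strobing a numerical integration once per forcing period; the parameter values, initial condition and tolerances are illustrative assumptions, and this direct strobing does not reproduce the canonical-form and extension procedure developed in the paper.

import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch: Poincare section of a forced Duffing oscillator
#   x'' + delta*x' + alpha*x + beta*x**3 = gamma*cos(omega*t)
# obtained by sampling the flow once per forcing period T = 2*pi/omega.
delta, alpha, beta, gamma, omega = 0.3, -1.0, 1.0, 0.5, 1.2   # assumed values

def duffing(t, y):
    x, v = y
    return [v, -delta * v - alpha * x - beta * x**3 + gamma * np.cos(omega * t)]

T = 2.0 * np.pi / omega
y = [0.1, 0.0]                  # assumed initial condition
section = []
for n in range(2000):
    sol = solve_ivp(duffing, (n * T, (n + 1) * T), y, rtol=1e-9, atol=1e-9)
    y = sol.y[:, -1]
    section.append(y)           # one point of the Poincare map per period
section = np.array(section)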
Abstract:
We briefly review the growth and structural properties of InAsSb bulk single crystals and InAsSb epitaxial films grown on semi-insulating GaAs substrates. Temperature-dependent transport measurements on these samples are then correlated with the information obtained from structural (XRD, TEM, SEM) and optical (FTIR absorption) investigations. The temperature dependence of the mobility and the Hall coefficient is modelled theoretically by exactly solving the linearized Boltzmann transport equation through inversion of the collision matrix, and the relative role of the various scattering mechanisms in limiting the mobility at low temperature is estimated. Finally, the first observation of Shubnikov-de Haas oscillations in InAsSb is discussed.
Abstract:
The variation in temperature and concentration plays a crucial role in predicting the final microstructure during solidification of a binary alloy. Most of the experimental techniques used to measure concentration and temperature are intrusive in nature and affect the flow field. In this paper, the main focus is on in-situ, non-intrusive, transient measurement of concentration and temperature during the solidification of a binary mixture of aqueous ammonium chloride solution (a metal-analog system) in a top-cooled cavity using a laser-based Mach-Zehnder interferometric technique. It was found from the interferograms that the angular deviation of the fringe pattern and the total number of fringes exhibit significant sensitivity to the refractive index and hence are functions of the local temperature and concentration of the NH4Cl solution inside the cavity. Using these fringe characteristics, calibration curves were established for the range of temperature and concentration levels expected during the solidification process. In the actual solidification experiments, two hypoeutectic solutions (5% and 15% NH4Cl) were chosen. The calibration curves were used to determine the temperature and concentration of the solution inside the cavity during solidification of the 5% and 15% NH4Cl solutions at different instants of time. The measurement was carried out at a fixed point in the cavity, and the concentration variation with time was recorded as the solid-liquid interface approached the measurement point. The measurement exhibited distinct zones of concentration distribution caused by solute rejection and Rayleigh-Bénard convection. Further studies involving flow visualization with laser scattering confirmed the Rayleigh-Bénard convection. Computational modeling was also performed, which corroborated the experimental findings. (C) 2011 Elsevier Ltd. All rights reserved.
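A minimal sketch of how such calibration curves might be applied, assuming simple linear fits between the fringe characteristics and the measured quantities; every data point below is a placeholder, and none of the values come from the experiment.

import numpy as np

# Minimal sketch: linear calibration curves mapping fringe characteristics to
# temperature and NH4Cl concentration. All data points below are placeholders.
fringe_count      = np.array([4.0, 7.0, 10.0, 13.0, 16.0])
temperature_C     = np.array([24.0, 19.5, 15.0, 10.5, 6.0])

angular_dev_deg   = np.array([0.8, 1.6, 2.4, 3.2, 4.0])
concentration_pct = np.array([3.0, 6.0, 9.0, 12.0, 15.0])

temp_fit = np.polyfit(fringe_count, temperature_C, 1)        # assumed linear calibration
conc_fit = np.polyfit(angular_dev_deg, concentration_pct, 1)

def temperature_from_fringes(n):
    """Interpolate the local temperature from the number of fringes observed."""
    return np.polyval(temp_fit, n)

def concentration_from_deviation(theta_deg):
    """Interpolate the local concentration from the angular fringe deviation."""
    return np.polyval(conc_fit, theta_deg)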
Abstract:
In this paper, the diversity-multiplexing gain tradeoff (DMT) of single-source, single-sink (ss-ss), multihop relay networks having slow-fading links is studied. In particular, the two end-points of the DMT of ss-ss full-duplex networks are determined by showing that the maximum achievable diversity gain is equal to the min-cut and that the maximum multiplexing gain is equal to the min-cut rank, the latter by using an operational connection to a deterministic network. Also included in the paper are several results that aid in the computation of the DMT of networks operating under amplify-and-forward (AF) protocols. In particular, it is shown that the colored noise encountered in amplify-and-forward protocols can be treated as white for the purpose of DMT computation, lower bounds on the DMT of lower-triangular channel matrices are derived, and the DMT of parallel MIMO channels is computed. All protocols appearing in the paper are explicit and rely only upon AF relaying. Half-duplex networks and explicit coding schemes are studied in a companion paper.
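For context, the standard diversity-multiplexing tradeoff definition (due to Zheng and Tse) is recalled below in generic notation; the paper's own notation and channel model may differ.

% Standard DMT definition (Zheng-Tse), stated in generic notation. A family of
% schemes with rate R(SNR) = r log SNR has multiplexing gain r; its diversity
% gain is the high-SNR decay exponent of the error probability P_e.
\[
  r \;=\; \lim_{\mathrm{SNR}\to\infty}\frac{R(\mathrm{SNR})}{\log \mathrm{SNR}},
  \qquad
  d(r) \;=\; -\lim_{\mathrm{SNR}\to\infty}
             \frac{\log P_{e}(\mathrm{SNR})}{\log \mathrm{SNR}} .
\]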
Abstract:
The present study reports a two-dimensional NMR experiment which separates the single-quantum spectra of the enantiomers in a racemic mixture. The experiment is a blend of selective double-quantum refocusing, for resolving couplings and chemical shift interactions along the two dimensions, followed by correlation of the selectively excited protons to the entire coupled spin network. The concept is based solely on the presence of distinct intra-methyl dipolar couplings of the different enantiomers when dissolved in a chiral orienting medium. The analysis of the single-enantiomer spectra obtained from the respective F2 cross sections yields all the spectral information. (C) 2011 Elsevier Inc. All rights reserved.