37 results for Issue of housing

at Indian Institute of Science - Bangalore - India


Relevance:

100.00%

Publisher:

Abstract:

It is a policy of Solid State Communications’ Executive Editorial Board to organize special issues from time to time on topics of current interest. The present issue focuses on soft condensed matter, a rapidly developing and diverse area of importance not only for basic science but also for its potential applications. The ten articles in this issue are intended to give readers a snapshot of some of the latest developments in soft condensed matter, mainly from the point of view of basic science. As the special issues are intended for a broad audience, most articles are short reviews that introduce readers to the relevant topics. Hence this special issue can be especially helpful to readers who are not specialists in this area but would like a quick grasp of some of the interesting research directions.

Relevance:

100.00%

Publisher:

Abstract:

Numerical simulation of separated flows in rocket nozzles is challenging because existing turbulence models are unable to predict them correctly. This paper addresses this issue with the Spalart-Allmaras and Shear Stress Transport (SST) eddy-viscosity models, which predict flow separation with moderate success. Their performance has been compared against experimental data for a conical and two contoured subscale nozzles. It is found that they fail to predict the separation location correctly, exhibiting sensitivity to the nozzle pressure ratio (NPR) and nozzle type. A careful assessment indicated how the models had to be tuned for better, consistent prediction. It is learnt that the SST model's failure is caused by the limiting of the shear stress inside the boundary layer according to Bradshaw's assumption, and by overprediction of the jet spreading rate. Accordingly, the SST coefficients were empirically modified to match the experimental wall pressure data. The results confirm that accurate RANS prediction of separation depends on the correct capture of the jet spreading rate, and that this is feasible over a wide range of NPRs with modified values of the diffusion coefficients in the turbulence model. (C) 2015 Elsevier Masson SAS. All rights reserved.
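The shear-stress limiting that the abstract identifies as the SST model's failure mode can be sketched in a few lines. This is a generic illustration of Menter's standard limiter (Bradshaw's assumption), not the paper's modified coefficients, and all input values are invented:

```python
# A generic sketch of the SST shear-stress limiter (Bradshaw's assumption):
# the eddy viscosity is capped so the modeled shear stress cannot exceed
# a1*k. Constants follow Menter's standard SST model; the paper's modified
# diffusion coefficients are not reproduced here.
A1 = 0.31  # Bradshaw constant

def sst_eddy_viscosity(k, omega, strain_rate, f2):
    """nu_t = a1*k / max(a1*omega, S*F2), Menter's SST limiter."""
    return A1 * k / max(A1 * omega, strain_rate * f2)

def unlimited_eddy_viscosity(k, omega):
    """Plain k-omega relation, nu_t = k/omega, with no limiter."""
    return k / omega

# In a strongly strained region (large S, F2 ~ 1, as inside a boundary
# layer), the limiter activates and returns a smaller nu_t:
print(sst_eddy_viscosity(0.5, 100.0, 5000.0, 1.0))
print(unlimited_eddy_viscosity(0.5, 100.0))
```

Re-tuning the model, as the abstract describes, amounts to adjusting such coefficients until the predicted wall pressure matches experiment.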

Relevance:

90.00%

Publisher:

Abstract:

The measurement of the surface energy balance over a land surface in an open area in Bangalore is reported. Measurements of all variables needed to calculate the surface energy balance on time scales longer than a week are made. Components of the radiative fluxes are measured directly, while the sensible and latent heat fluxes are based on the bulk method, using measurements made at two levels on a micrometeorological tower of 10 m height. The bulk flux formulation is verified by comparing its fluxes with direct fluxes computed from sonic anemometer data sampled at 10 Hz. Soil temperature is measured at four depths. Data have been collected continuously for over 6 months, covering the pre-monsoon and monsoon periods of the year 2006. The study first addresses the issue of measuring the fluxes accurately. It is shown that water vapour measurements are the most crucial: a bias of 0.25% in relative humidity, which is well above the normal accuracy assumed by the manufacturers but achievable in the field using a combination of laboratory calibration and field intercomparisons, results in about a 20 W m^-2 change in the latent heat flux on the seasonal time scale. On the seasonal time scale, the net longwave radiation is the largest energy loss term at the experimental site. The seasonal variation in the energy sink term is small compared to that in the energy source term.
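The bulk method mentioned above can be sketched with the standard bulk aerodynamic formulas. The transfer coefficients and all numerical values below are illustrative assumptions, not the study's calibrated values:

```python
# A minimal sketch of the bulk aerodynamic method for sensible and latent
# heat fluxes from two-level measurements. The transfer coefficients C_H and
# C_E are treated as constants here for illustration; in practice they depend
# on atmospheric stability and surface roughness.
RHO = 1.16      # air density, kg m^-3 (assumed)
CP = 1005.0     # specific heat of air, J kg^-1 K^-1
LV = 2.45e6     # latent heat of vaporization, J kg^-1

def sensible_heat_flux(c_h, wind, t_surface, t_air):
    """H = rho * cp * C_H * U * (Ts - Ta), in W m^-2."""
    return RHO * CP * c_h * wind * (t_surface - t_air)

def latent_heat_flux(c_e, wind, q_surface, q_air):
    """LE = rho * Lv * C_E * U * (qs - qa), in W m^-2."""
    return RHO * LV * c_e * wind * (q_surface - q_air)

# Example: 2 K temperature difference and 1 g/kg specific-humidity
# difference between the two levels, at 2 m/s wind.
print(round(sensible_heat_flux(2e-3, 2.0, 305.0, 303.0), 1))
print(round(latent_heat_flux(2e-3, 2.0, 0.018, 0.017), 1))
```

Note how a small error in the humidity difference scales directly into the latent heat flux, which is the sensitivity the study quantifies.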

Relevance:

90.00%

Publisher:

Abstract:

A desalination system is a complex multi-energy-domain system comprising power/energy flow across several domains, such as the electrical, thermal, and hydraulic ones. Dynamic modeling of a desalination system that comprehensively covers all these energy domains has not been adequately addressed in the literature. This paper addresses the issue of modeling the various energy domains for the case of a single-stage flash evaporation desalination system. It presents a detailed bond graph model of a desalination unit with seamless integration of the power flow across the electrical, thermal, and hydraulic domains. The paper further proposes a performance index function that enables tracking of the optimal chamber pressure, i.e., the pressure giving the optimal flow rate for a given unit of energy expended. The model has been validated under steady-state conditions by simulation and experiment.
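The idea of a performance index that selects the chamber pressure maximizing flow per unit of energy can be illustrated with a toy model. The flow and energy curves below are invented placeholders, not the paper's bond-graph model:

```python
# Toy illustration of a performance index over chamber pressure: distillate
# flow per unit of energy expended, maximized by a simple grid scan. Both
# component models are hypothetical shapes on a normalized 0..1 pressure axis.
def flow_rate(p):
    """Hypothetical distillate flow vs. chamber pressure (peaks mid-range)."""
    return p * (1.0 - p)

def energy_input(p):
    """Hypothetical energy cost, higher at lower chamber pressure."""
    return 1.0 + (1.0 - p) ** 2

def performance_index(p):
    """Flow delivered per unit of energy expended."""
    return flow_rate(p) / energy_input(p)

candidates = [i / 100.0 for i in range(1, 100)]
p_opt = max(candidates, key=performance_index)
print(round(p_opt, 2))  # pressure at which the index peaks
```

A model-based controller would track this optimum online rather than scan a grid, but the objective has the same shape.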

Relevance:

90.00%

Publisher:

Abstract:

Concurrency control (CC) algorithms are important in distributed database systems to ensure the consistency of the database. A number of such algorithms are available in the literature, and the performance evaluation of these algorithms has been recognized to be important; however, only a few studies have been carried out to this end. This paper deals with the performance evaluation of the CC algorithm proposed by Rosenkrantz et al. through a detailed simulation study. In doing so, the algorithm has been modified so that it can, within itself, take care of redundancy in the database. The influence of various system parameters and of the transaction profile on the response time and on the degree of conflict is considered. The entire study has been carried out using the programming language SIMULA on a DEC-1090 system.
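The conflict-resolution rules at the heart of the Rosenkrantz et al. timestamp-based schemes ("wait-die" and "wound-wait") can be sketched as follows. This is the textbook form of those rules, not the paper's full protocol or its redundancy handling:

```python
# Timestamp-based deadlock avoidance: a smaller timestamp means an older
# transaction. Each rule decides what a requester does when the lock it
# wants is held by another transaction.

def wait_die(requester_ts, holder_ts):
    """Wait-die: an older requester waits; a younger one aborts (dies)."""
    return "wait" if requester_ts < holder_ts else "die"

def wound_wait(requester_ts, holder_ts):
    """Wound-wait: an older requester preempts (wounds) the holder;
    a younger one waits."""
    return "wound" if requester_ts < holder_ts else "wait"

print(wait_die(1, 2))    # older transaction requesting: waits
print(wound_wait(1, 2))  # older transaction requesting: wounds the holder
```

Both rules guarantee deadlock freedom because they never let a cycle of waiting transactions form.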

Relevance:

90.00%

Publisher:

Abstract:

The issue of dynamic spectrum scene analysis in any cognitive radio network becomes extremely complex when low-probability-of-intercept, spread spectrum systems are present in the environment. Detection and estimation become even more complex if the frequency hopping spread spectrum is adaptive in nature. In this paper, we propose a two-phase approach for the detection and estimation of frequency hopping signals. A polyphase filter bank is proposed as the architecture of choice for the detection phase, to efficiently detect the presence of a frequency hopping signal. Based on the modeling of the frequency hopping signal, it can be shown that parametric methods of line spectral analysis are well suited for estimation of frequency hopping signals, provided the issues of order estimation and time localization are resolved. An algorithm using line spectra parameter estimation and wavelet-based transient detection is proposed, which resolves the above issues in a computationally efficient manner suitable for implementation in a cognitive radio. Simulations show promising results, demonstrating that adaptive frequency hopping signals can be detected and demodulated in a noncooperative context, in real time, even at a very low signal-to-noise ratio.
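The channelized detection step can be sketched as follows. A true polyphase filter bank applies a prototype low-pass filter before the FFT stage; this sketch uses a plain windowed FFT (a degenerate case of that structure) to keep it short, and all signal parameters are invented:

```python
import numpy as np

# Channelized detection of a frequency-hopped tone: split the band into
# channels and flag the strongest one when its energy exceeds a threshold.
FS = 8000.0   # sample rate, Hz (assumed)
N_CH = 64     # number of analysis channels

def channel_energies(block):
    """Energy in each frequency channel for one block of 2*N_CH samples."""
    spectrum = np.fft.rfft(block * np.hanning(len(block)))
    return np.abs(spectrum[:N_CH]) ** 2

def detect_hop_channel(block, threshold):
    """Index of the strongest channel, or None if nothing exceeds threshold."""
    energies = channel_energies(block)
    k = int(np.argmax(energies))
    return k if energies[k] > threshold else None

# A tone 'hopped' to 1500 Hz lands in channel 1500 / (FS / (2 * N_CH)) = 24.
t = np.arange(2 * N_CH) / FS
block = np.sin(2 * np.pi * 1500.0 * t)
print(detect_hop_channel(block, threshold=10.0))
```

The line-spectrum parameter estimation and wavelet-based transient detection described in the abstract would then refine the hop frequency and hop instants; they are beyond this sketch.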

Relevance:

90.00%

Publisher:

Abstract:

We address the issue of noise robustness of reconstruction techniques for frequency-domain optical-coherence tomography (FDOCT). We consider three reconstruction techniques: Fourier, iterative phase recovery, and cepstral techniques. We characterize the reconstructions in terms of their statistical bias and variance and obtain approximate analytical expressions under the assumption of small noise. We also perform Monte Carlo analyses and show that the experimental results are in agreement with the theoretical predictions. It turns out that the iterative and cepstral techniques yield reconstructions with a smaller bias than the Fourier method. The three techniques, however, have identical variance profiles, and their consistency increases linearly as a function of the signal-to-noise ratio.
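Of the three techniques compared, the Fourier reconstruction is the simplest and can be sketched directly: the depth (A-scan) profile is the magnitude of the inverse FFT of the background-subtracted spectral interferogram. The single ideal reflector simulated below is an invented example; the iterative phase-recovery and cepstral variants are not shown:

```python
import numpy as np

# Fourier reconstruction for FDOCT on a synthetic single-reflector
# interferogram: a cosine fringe in wavenumber maps to a peak at the
# reflector's depth after an inverse FFT.
N = 1024
k = np.arange(N)                       # sampled wavenumber axis (arbitrary units)
depth_bin = 100                        # assumed reflector depth, in FFT bins
interferogram = np.cos(2 * np.pi * depth_bin * k / N)  # single-reflector fringe

a_scan = np.abs(np.fft.ifft(interferogram))
peak = int(np.argmax(a_scan[: N // 2]))  # keep the positive-depth half only
print(peak)
```

The mirror peak in the negative-depth half is the complex-conjugate ambiguity that the phase-recovery and cepstral techniques are designed to suppress.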

Relevance:

90.00%

Publisher:

Abstract:

We address the issue of rate-distortion (R/D) performance optimality of the recently proposed switched split vector quantization (SSVQ) method. The distribution of the source is modeled using a Gaussian mixture density, and thus the non-parametric SSVQ is analyzed in a parametric, model-based framework for achieving optimum R/D performance. Using high-rate quantization theory, we derive the optimum bit allocation formulae for the intra-cluster split vector quantizer (SVQ) and for the inter-cluster switching. For wide-band speech line spectrum frequency (LSF) parameter quantization, it is shown that the Gaussian mixture model (GMM) based parametric SSVQ method provides a 1 bit/vector advantage over the non-parametric SSVQ method.
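The flavor of the high-rate bit allocation result can be illustrated with the classical per-component rule: each component gets the average rate plus half the log ratio of its variance to the geometric mean of all variances. The variances below are invented, and the paper's full formulae for split dimensions and inter-cluster switching are richer than this sketch:

```python
import math

# Classical high-rate optimal bit allocation across components of differing
# variance. Components with larger variance receive proportionally more bits;
# the allocations always sum to the total budget.
def optimal_bit_allocation(variances, total_bits):
    n = len(variances)
    geo_mean = math.exp(sum(math.log(v) for v in variances) / n)
    avg = total_bits / n
    return [avg + 0.5 * math.log2(v / geo_mean) for v in variances]

bits = optimal_bit_allocation([4.0, 1.0, 0.25], total_bits=12)
print([round(b, 2) for b in bits])
print(round(sum(bits), 2))  # equals the total budget
```

In practice the real-valued allocations are rounded to integers subject to the budget, which costs a small fraction of a bit.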

Relevance:

90.00%

Publisher:

Abstract:

Cricket is one of the most popular games in the Asian subcontinent, and its popularity is increasing every day. The replacement of the cricket ball during a match is always an uncomfortable situation for teams, umpires, and even supporters. At present, the basis for the replacement is solely the judgement, experience, and expertise of the umpires, which is subjective, controversial, and debatable. In this paper, we attempt a new approach to quantify the number of impacts, or impact factor, of the 4-piece leather ball used in international one-day and Test cricket matches. This provides a more objective and scientific criterion for the replacement of the ball. We use well-known and widely used thermal infra-red (TIR) imaging to capture the dynamics of the thermal profile of a cricket ball that has been heated for about 15 seconds. The idea behind this approach is the simple observation that an old ball (a ball with a number of impacts) has a different thermal signature/profile compared to that of a new ball. This could be due to changes in the surface profile and internal structure, minor de-shaping, opening of the seam, etc. The TIR video and its frames, which are inherently noisy, are restored using a Hebbian-learning-based FIR filter, which performs optimal smoothing in relatively few iterations. We focus on the hottest region of the ball, i.e., the inner core, and track its thermal profile dynamics. Finally, we use a multilayer perceptron (MLP) model to quantify the impact factor with fairly good accuracy.

Relevance:

90.00%

Publisher:

Abstract:

In wireless ad hoc networks, nodes communicate with far-off destinations using intermediate nodes as relays. Since wireless nodes are energy constrained, it may not be in the best interest of a node to always accept relay requests. On the other hand, if all nodes decide not to expend energy in relaying, then network throughput will drop dramatically. Both of these extreme scenarios (complete cooperation and complete noncooperation) are inimical to the interests of a user. In this paper, we address the issue of user cooperation in ad hoc networks. We assume that nodes are rational, i.e., their actions are strictly determined by self-interest, and that each node is associated with a minimum lifetime constraint. Given these lifetime constraints and the assumption of rational behavior, we determine the optimal share of service that each node should receive. We define this to be the rational Pareto optimal operating point. We then propose a distributed and scalable acceptance algorithm called Generous TIT-FOR-TAT (GTFT), which nodes use to decide whether to accept or reject a relay request. We show that GTFT results in a Nash equilibrium and prove that the system converges to the rational Pareto optimal operating point.
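The spirit of a Generous TIT-FOR-TAT acceptance rule can be sketched as follows: a node keeps accepting relay requests as long as the service it has provided does not exceed the service it has received by more than a generosity margin. The bookkeeping and the margin below are assumptions for illustration, not the paper's exact algorithm:

```python
# A toy GTFT-style relay acceptance rule. 'Generosity' lets a node relay a
# little more than strict reciprocity would allow, which is what keeps the
# network out of the all-defect equilibrium.
class GtftNode:
    def __init__(self, generosity=0.1):
        self.relayed = 0      # requests this node has relayed for others
        self.received = 0     # relay services others performed for this node
        self.generosity = generosity

    def accept_relay(self):
        """Accept unless we are already 'ahead' by more than the margin."""
        margin = self.generosity * max(1, self.received)
        if self.relayed <= self.received + margin:
            self.relayed += 1
            return True
        return False

node = GtftNode(generosity=0.5)
node.received = 2
print([node.accept_relay() for _ in range(6)])  # generous at first, then declines
```

With zero generosity no node ever relays first and the system collapses to noncooperation, which is why the small margin matters.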

Relevance:

90.00%

Publisher:

Abstract:

In this paper we address the fundamental issue of temperature fluctuation during the thermal denaturation (the unzipping of the two strands on heating) of double-stranded (ds) DNA. In our experiments we observe extremely large thermal fluctuations during DNA denaturation, several orders of magnitude higher than the fluctuations at temperatures away from the denaturation range. This fluctuation is absent in single-stranded (ss) DNA. The magnitude of the fluctuation is much higher in heteropolymeric DNA and is almost absent in short homopolymeric DNA fragments. The temperature range over which the denaturation occurs (i.e., over which the thermal fluctuation is large) depends on the length of the DNA and is largest for the longest DNA.
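A toy two-state picture shows why fluctuations peak at denaturation: if a base pair is open with probability p(T), the variance of its state, p(1 - p), is maximal where p = 1/2, i.e., at the melting temperature. The sigmoid and its parameters below are arbitrary illustrations, not a fit to the experiments:

```python
import math

# Two-state melting toy model: open_fraction(T) is a sigmoid around the
# melting temperature TM, and the per-base-pair state variance p*(1-p)
# peaks exactly at TM, mirroring the fluctuation maximum seen at melting.
TM = 350.0      # melting temperature, K (assumed)
WIDTH = 2.0     # transition width, K (assumed)

def open_fraction(t):
    """Probability that a base pair is open at temperature t."""
    return 1.0 / (1.0 + math.exp(-(t - TM) / WIDTH))

def state_variance(t):
    """Variance of the open/closed state, p*(1-p)."""
    p = open_fraction(t)
    return p * (1.0 - p)

print([round(state_variance(t), 4) for t in (340.0, 350.0, 360.0)])  # largest at TM
```

A narrower transition width for short homopolymers would confine the variance peak to a very small temperature range, consistent with the near-absence of fluctuations reported for them.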

Relevance:

90.00%

Publisher:

Abstract:

The present study was designed to improve the bioavailability of forskolin by modifying its precorneal residence time and dissolution characteristics. Nanosizing is an advanced approach to overcoming the poor aqueous solubility of active pharmaceutical ingredients. Forskolin nanocrystals were successfully manufactured and stabilized with poloxamer 407, and characterized in terms of particle size by scanning electron microscopy and dynamic light scattering. By formulating Noveon AA-1 polycarbophil/poloxamer 407 platforms at specific concentrations, it was possible to obtain a pH- and thermoreversible gel with a pH(gel)/T-gel close to the pH/temperature of the eye. The addition of forskolin nanocrystals altered neither the gelation properties of the Noveon AA-1 polycarbophil/poloxamer 407 platform nor the nanocrystal properties of forskolin. The formulation was stable over a period of 6 months at room temperature. In vitro release experiments indicated that the optimized platform was able to prolong and control forskolin release for more than 5 h. In vivo studies on dexamethasone-induced glaucomatous rabbits indicated that the intraocular-pressure-lowering efficacy of the nanosuspension/hydrogel system was 31% and lasted for 12 h, significantly better than the effect of a traditional eye suspension (18%, 4-6 h). Hence, our investigations prove that the pH- and thermoreversible polymeric in situ gel-forming nanosuspension, with its ability to control drug release, exhibits great potential for glaucoma therapy.

Relevance:

90.00%

Publisher:

Abstract:

Conventional three-dimensional isoparametric elements are susceptible to locking when used to model plate/shell geometries or when the meshes are distorted. Hybrid elements, which are based on a two-field variational formulation, are immune to most of these problems and hence can be used to efficiently model both "chunky" three-dimensional and plate/shell-type structures. Thus, a single element type can be used to model all types of structures, which also allows us to use a standard dual algorithm for carrying out the topology optimization of the structure. We also address the issue of the manufacturability of the designs.