242 results for Covering Radius


Relevance:

10.00%

Publisher:

Abstract:

Hypoeutectic boron addition (0.1 wt.%) to Ti-6Al-4V is known to cause significant refinement of the cast microstructure. In the present investigation, it has been observed that trace boron addition to Ti-6Al-4V alloy also ensures excellent microstructural homogeneity throughout the ingot. A subdued thermal gradient, related to the basic grain refinement mechanism by constitutional undercooling, persists during solidification of the boron-containing alloy and maintains equivalent beta grain growth kinetics at different locations in the ingot. The Ti-6Al-4V alloy shows relatively strong texture, with preferred components (e.g. ingot axis parallel to [0 0 0 1] or [1 0 1̄ 0]) over the entire ingot and a gradual transition of texture components along the radius. For the Ti-6Al-4V-0.1B alloy, significant weakening characterizes both the high-temperature beta and the room-temperature alpha texture. In addition to the solidification factors that are responsible for the weak beta texture development, microstructural differences due to boron addition, e.g. the absence of grain boundary alpha phase and the presence of TiB particles, strongly affect the mechanism of the beta → alpha phase transformation and consequently weaken the alpha phase texture. Based on the understanding developed for the boron-modified alloy, a novel mechanism has been proposed for the microstructure and texture formation during solidification and phase transformation. (C) 2011 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.


During a lightning strike to a tall grounded object (TGO), reflections of current waves are known to occur at both ends of the TGO. These reflections modify the channel current and hence the lightning electromagnetic fields. This study aims to identify the possible contributing factors to reflection at the TGO-channel junction for the current waves ascending the TGO. The possible sources of reflection identified are the corona sheath and the discontinuities of resistance and radius. For analyzing the contribution of the corona sheath and of the discontinuity of resistance at the junction, a macroscopic physical model for the return stroke developed in our earlier work is employed. NEC-2D is used for assessing the contribution of the abrupt change in radii at the TGO-channel junction; the wire-cage model adopted for this purpose is validated using laboratory experiments. Detailed investigation revealed the following: the main contributor to reflection at the TGO-channel junction is the difference between the TGO and channel core radii; the discontinuity of resistance at the junction can be of some relevance only in the first-microsecond regime; and the corona sheath does not play any significant role in the reflection.
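
As a rough illustration of why the radius discontinuity dominates, one can compare transmission-line surge impedances on either side of the junction. The thin-wire impedance expression used here, and the height and radii, are illustrative assumptions for a back-of-the-envelope sketch, not values or models from the study:

```python
import math

def surge_impedance(height_m, radius_m):
    # Simplified thin-wire approximation for a vertical conductor above a
    # perfectly conducting ground plane (assumption for illustration).
    return 60.0 * math.log(2.0 * height_m / radius_m)

def current_reflection(z_from, z_to):
    # Reflection coefficient seen by a current wave travelling from a
    # section of impedance z_from into one of impedance z_to.
    return (z_from - z_to) / (z_from + z_to)

h = 100.0                                # TGO height in m (hypothetical)
z1 = surge_impedance(h, 2.0)             # tower radius 2 m (hypothetical)
z2 = surge_impedance(h, 0.02)            # channel core radius 2 cm (hypothetical)
gamma = current_reflection(z1, z2)
print(f"Z_TGO = {z1:.0f} ohm, Z_channel = {z2:.0f} ohm, gamma = {gamma:.2f}")
```

Because the channel core is much thinner than the tower, its surge impedance is higher, so an ascending current wave sees a negative reflection coefficient at the junction.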


Wireless sensor networks can often be viewed in terms of a uniform deployment of a large number of nodes on a region in Euclidean space, e.g., the unit square. After deployment, the nodes self-organise into a mesh topology. In a dense, homogeneous deployment, a frequently used approximation is to take the hop distance between nodes to be proportional to the Euclidean distance between them. In this paper, we analyse the performance of this approximation. We show that nodes with a certain hop distance from a fixed anchor node lie within a certain annulus with probability approaching unity as the number of nodes n → ∞. We take a uniform, i.i.d. deployment of n nodes on a unit square, and consider the geometric graph on these nodes with radius r(n) = c√(ln n / n). We show that, for a given hop distance h of a node from a fixed anchor on the unit square, the Euclidean distance lies within [(1−ε)(h−1)r(n), h·r(n)], for ε > 0, with probability approaching unity as n → ∞. This result shows that a node with hop distance h from the anchor is more likely to lie within this annulus, centred at the anchor location and of width roughly r(n), than close to a circle whose radius is exactly proportional to h. We show that if the radius r of the geometric graph is fixed, the convergence of the probability is exponentially fast. Similar results hold for a randomised lattice deployment. We provide simulation results that illustrate the theory, and serve to show how large n needs to be for the asymptotics to be useful.
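
The annulus claim is easy to probe numerically. The sketch below (the values of n, c and ε are arbitrary choices, not the paper's) builds a random geometric graph with radius r(n) = c√(ln n / n), runs breadth-first search from an anchor node, and reports how many nodes fall inside the predicted annulus; note that the upper bound d ≤ h·r(n) holds deterministically, since each hop covers at most r(n):

```python
import numpy as np
from collections import deque

rng = np.random.default_rng(0)
n, c, eps = 1500, 2.0, 0.2
r = c * np.sqrt(np.log(n) / n)           # connectivity-scale radius

pts = rng.random((n, 2))                 # i.i.d. uniform on the unit square
d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
adj = (d <= r) & ~np.eye(n, dtype=bool)

# BFS hop distances from anchor node 0.
hops = np.full(n, -1)
hops[0] = 0
q = deque([0])
while q:
    u = q.popleft()
    for v in np.flatnonzero(adj[u]):
        if hops[v] < 0:
            hops[v] = hops[u] + 1
            q.append(v)

reach = hops > 0                         # reachable nodes (anchor excluded)
inside = d[0, reach] >= (1 - eps) * (hops[reach] - 1) * r
print(f"fraction of nodes inside the annulus: {inside.mean():.3f}")
```

At this n the fraction is already close to one, illustrating how quickly the asymptotics become useful.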


We investigate the feasibility of developing comprehensive gate delay and slew models that incorporate output load, input edge slew, supply voltage, temperature, global process variations and local process variations all in the same model. We find that standard polynomial models cannot handle such a large, heterogeneous set of input variables. We instead use neural networks, which are well known for their ability to approximate arbitrary continuous functions. Our initial experiments with a small subset of standard-cell gates from an industrial 65 nm library show promising results, with error in the mean less than 1%, error in the standard deviation less than 3% and maximum error less than 11% compared to SPICE, for models covering 0.9–1.1 V of supply, −40 °C to 125 °C of temperature, load, slew, and global and local process parameters. Enhancing conventional libraries to be voltage and temperature scalable with similar accuracy requires on average 4× more SPICE characterization runs.
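
A minimal sketch of the idea: fit a toy delay surface with a one-hidden-layer network trained by plain gradient descent. The synthetic delay function, network size and learning rate are invented for illustration; the paper's actual models and library characterization data are not reproduced here:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "gate delay" surface (hypothetical): nonlinear in normalised
# load and input slew, loosely mimicking library characterization data.
X = rng.random((512, 2))                       # columns: load, slew
y = 0.5 * X[:, 0] + 0.3 * np.sqrt(X[:, 1]) + 0.4 * X[:, 0] * X[:, 1]

# One-hidden-layer MLP trained with plain gradient descent.
H, lr = 16, 0.1
W1 = rng.normal(0.0, 1.0, (2, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 1.0, H);      b2 = 0.0

def forward(X):
    hid = np.tanh(X @ W1 + b1)
    return hid, hid @ W2 + b2

_, pred0 = forward(X)
loss0 = np.mean((pred0 - y) ** 2)              # MSE before training

for _ in range(2000):
    hid, pred = forward(X)
    err = pred - y                             # proportional to dL/dpred
    gW2 = hid.T @ err / len(y); gb2 = err.mean()
    dhid = np.outer(err, W2) * (1 - hid ** 2)  # backprop through tanh
    gW1 = X.T @ dhid / len(y);  gb1 = dhid.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

_, pred = forward(X)
loss = np.mean((pred - y) ** 2)
print(f"MSE: {loss0:.4f} -> {loss:.4f}")
```

In the same spirit as the abstract, a single trained network replaces a family of per-corner polynomial fits over the heterogeneous input space.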



We use the H I scale height data along with the H I rotation curve as constraints to probe the shape and density profile of the dark matter halos of M31 (Andromeda) and the superthin, low surface brightness (LSB) galaxy UGC 07321. We model the galaxy as a two-component system of gravitationally coupled stars and gas subjected to the force field of a dark matter halo. For M31, we get a flattened halo, which is required to match the outer galactic H I scale height data, with our best-fit axis ratio (0.4) lying at the most oblate end of the distributions obtained from cosmological simulations. For UGC 07321, our best-fit halo core radius is only slightly larger than the stellar disc scale length, indicating that the halo is important even at small radii in this LSB galaxy. The high value of the gas velocity dispersion required to match the scale height data can explain the low star-formation rate of this galaxy.
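
For intuition on cored halo models of this kind: a pseudo-isothermal sphere with central density ρ0 and core radius Rc, ρ(r) = ρ0 / (1 + (r/Rc)²), has the closed-form circular speed v²(r) = 4πG ρ0 Rc² [1 − (Rc/r) arctan(r/Rc)]. The sketch below evaluates it with illustrative parameter values, not the paper's fitted ones:

```python
import numpy as np

G = 4.301e-6          # gravitational constant, kpc (km/s)^2 / Msun
rho0 = 5e7            # core density, Msun/kpc^3 (hypothetical)
Rc = 2.0              # core radius, kpc (hypothetical)

def v_circ(r):
    # Circular speed of a pseudo-isothermal sphere,
    # rho(r) = rho0 / (1 + (r/Rc)^2), r in kpc, result in km/s.
    return np.sqrt(4 * np.pi * G * rho0 * Rc**2
                   * (1 - (Rc / r) * np.arctan(r / Rc)))

r = np.linspace(0.1, 40.0, 200)
v = v_circ(r)
v_inf = np.sqrt(4 * np.pi * G * rho0) * Rc    # asymptotic flat speed
print(f"v(40 kpc) = {v[-1]:.1f} km/s, v_inf = {v_inf:.1f} km/s")
```

The curve rises monotonically and flattens toward v_inf, the behaviour such cored halos are invoked to reproduce.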


Seismic hazard analysis and microzonation of cities enable characterization of the potentially hazardous seismic areas that need to be taken into account when designing new structures or retrofitting existing ones. The study of seismic hazard and the preparation of geotechnical microzonation maps have been attempted using a Geographical Information System (GIS). GIS provides an effective solution for integrating different layers of information, and thus a useful input for city planning and, in particular, for earthquake-resistant design of structures in an area. Seismic hazard is the study of expected earthquake ground motions at any point on the earth. Microzonation is the process of subdividing a region into a number of zones based on earthquake effects at the local scale. Seismic microzonation is the process of estimating the response of soil layers under earthquake excitation and thus the variation of ground motion characteristics on the ground surface. For seismic microzonation, geotechnical site characterization needs to be assessed at the local (micro) scale, which is further used to assess the site response and liquefaction susceptibility of the sites. A seismotectonic atlas of the area within a radius of 350 km around Bangalore has been prepared with all the seismogenic sources and historic earthquake events (a catalogue of about 1400 events since 1906). We have attempted the site characterization of Bangalore by collating conventional geotechnical borehole data (about 900 boreholes with depth information) integrated in GIS. A 3-D subsurface model of Bangalore prepared using GIS is shown in Figure 1. Further, shear wave velocity surveys based on geophysical methods have been carried out at about 60 locations in the city over a 220 sq. km area. Site response and local site effects have been evaluated using 1-dimensional ground response analysis.
Spatial variability of soil overburden depths, ground-surface peak ground acceleration (PGA), spectral acceleration at different frequencies and liquefaction susceptibility have been mapped over the 220 sq. km area using GIS; ArcInfo software has been used for this purpose. These maps can be used for city planning and for risk and vulnerability studies. Figure 2 shows a map of peak ground acceleration at rock level for Bangalore city. Microtremor experiments were jointly carried out with NGRI scientists at about 55 locations in the city, and the predominant frequencies of the overburden soil columns were evaluated.


In this paper, the reduced level of rock at Bangalore, India is arrived at from 652 borehole records in an area covering 220 sq. km. To predict the reduced level of rock in the subsurface of Bangalore and to study the spatial variability of the rock depth, ordinary kriging and Support Vector Machine (SVM) models have been developed. In ordinary kriging, knowledge of the semivariogram of the reduced level of rock from the 652 points in Bangalore is used to predict the reduced level of rock at any point in the subsurface where field measurements are not available. A cross-validation (Q1 and Q2) analysis is also carried out for the developed ordinary kriging model. The SVM, a type of learning machine based on statistical learning theory that performs regression by introducing an ε-insensitive loss function, has been used to predict the reduced level of rock from the large data set. A comparison between the ordinary kriging and SVM models demonstrates that the SVM is superior to ordinary kriging in predicting rock depth.
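
A compact sketch of the ordinary kriging step on synthetic borehole data may help; the exponential semivariogram model and all numbers below are assumptions for illustration, not the paper's fitted semivariogram or data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical borehole locations (x, y in km) and rock depths (m),
# with a gentle spatial trend plus noise.
pts = rng.random((30, 2)) * 10.0
depth = 20.0 + 2.0 * pts[:, 0] + rng.normal(0.0, 0.5, 30)

def gamma(h, sill=5.0, rang=5.0):
    # Exponential semivariogram model (parameters assumed for illustration).
    return sill * (1.0 - np.exp(-h / rang))

def krige(pts, z, target):
    # Ordinary kriging: solve [[Gamma, 1], [1^T, 0]] lam = [gamma*, 1],
    # then predict as the weighted sum of observed values.
    n = len(z)
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(pts - target, axis=1))
    lam = np.linalg.solve(A, b)[:n]     # weights (constrained to sum to 1)
    return lam @ z

est = krige(pts, depth, np.array([5.0, 5.0]))
print(f"estimated rock depth at (5, 5): {est:.1f} m")
```

The Lagrange-multiplier row enforces that the weights sum to one, which is what makes the estimator unbiased for an unknown constant mean.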


Energy plays a prominent role in human society. As a result of technological and industrial development, the demand for energy is rapidly increasing. Existing power sources, which are mainly fossil-fuel based, are leaving an unacceptable legacy of waste and pollution, apart from a diminishing stock of fuels. Hence, the focus has now shifted to the large-scale propagation of renewable energy. Renewable energy technologies are clean sources of energy that have a much lower environmental impact than conventional energy technologies. Solar energy is one such renewable energy; indeed, most renewable energy comes either directly or indirectly from the sun. Estimation of the solar energy potential of a region requires detailed solar radiation climatology, and it is necessary to collect extensive radiation data of high accuracy covering all climatic zones of the region. In this regard, a decision support system (DSS) would help in estimating solar energy potential considering the region's energy requirement. This article explains the design and implementation of a DSS for the assessment of solar energy. The DSS, with executive information systems and reporting tools, helps to tap vast data resources and deliver information. The main hypothesis is that this tool can form the core of a practical methodology that remains resilient over time and can be used by decision-making bodies to assess various scenarios. It also offers a means of entering, accessing, and interpreting information for the purpose of sound decision making.


Lentic ecosystems' vital functions, such as recycling of nutrients, purification of water, recharge of groundwater, augmentation and maintenance of stream flow and provision of habitat for a wide variety of flora and fauna, along with their recreational value, necessitate their sustainable management through appropriate conservation mechanisms. Failure to restore these ecosystems will result in the extinction of species or ecosystem types and cause permanent ecological damage. In Bangalore, lentic ecosystems (for example, lakes) have played a prominent role in serving the needs of agriculture and drinking water. But the burgeoning population, accompanied by unplanned developmental activities, has led to a drastic reduction in their number (from 262 in 1976 to 81). The existing water bodies are contaminated by residential, agricultural, commercial and industrial wastes/effluents. In order to restore the ecosystem, assessment of the level of contamination is crucial. This paper focuses on the characterisation and restoration of Varthur lake based on hydrological, morphometric, physico-chemical and socio-economic investigations over a period of six months covering the post-monsoon season. The results of the water quality analysis show that the lake is eutrophic, with high concentrations of phosphorus and organic matter. The morphometric analysis indicates that the lake is shallow in relation to its surface area. Socio-economic analyses show the dependence of local residents on the lake for irrigation, fodder, etc. These analyses highlight the need and urgency to restore the physical, chemical and biological integrity of the lake through viable restoration and sustainable watershed management strategies, which include pollution abatement, catchment treatment, desilting of the lake and educating all stakeholders on the conservation and restoration of lake ecosystems.


The influence of riparian land use on the diversity and distribution of odonates was investigated by sampling 113 localities covering 4 districts in south-western Karnataka. A total of 55 species in 12 families were recorded. Streams, rivers and lakes had higher diversity than marshes and the sea coast; however, lakes had lower endemism than streams and rivers. Streams flowing through evergreen forests had higher diversity and endemism. Human-impacted riparian zones such as paddy fields had relatively lower species richness. However, streams flowing through forestry plantations had higher diversity than other natural riparian zones such as dry deciduous, moist deciduous and semi-evergreen forests. Myristica swamps, a relict evergreen forest marsh, had low diversity and high endemism. Odonate communities of lentic ecosystems and of human-impacted streams and rivers were characterized by widespread generalist species. Endemics and habitat specialists were restricted to streams and rivers with undisturbed riparian zones. The study documents possible odonate community change due to human impact; the influence of riparian land-use change on odonate communities is also discussed.


The problem of intrusion detection and location identification in the presence of clutter is considered for a hexagonal sensor-node geometry. It is noted that in any practical application,for a given fixed intruder or clutter location, only a small number of neighboring sensor nodes will register a significant reading. Thus sensing may be regarded as a local phenomenon and performance is strongly dependent on the local geometry of the sensor nodes. We focus on the case when the sensor nodes form a hexagonal lattice. The optimality of the hexagonal lattice with respect to density of packing and covering and largeness of the kissing number suggest that this is the best possible arrangement from a sensor network viewpoint. The results presented here are clearly relevant when the particular sensing application permits a deterministic placement of sensors. The results also serve as a performance benchmark for the case of a random deployment of sensors. A novel feature of our analysis of the hexagonal sensor grid is a signal-space viewpoint which sheds light on achievable performance.Under this viewpoint, the problem of intruder detection is reduced to one of determining in a distributed manner, the optimal decision boundary that separates the signal spaces SI and SC associated to intruder and clutter respectively. Given the difficulty of implementing the optimal detector, we present a low-complexity distributive algorithm under which the surfaces SI and SC are separated by a wellchosen hyperplane. The algorithm is designed to be efficient in terms of communication cost by minimizing the expected number of bits transmitted by a sensor.
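
The hyperplane idea can be sketched as follows: generate signal vectors on a small hexagonal patch for intruder-like and clutter-like sources, then separate them with a fixed hyperplane. The power-law path-loss model, the source amplitudes and the equal-weight hyperplane (which reduces to an energy detector) are illustrative assumptions, not the paper's signal model:

```python
import numpy as np

rng = np.random.default_rng(3)

# Small patch of a hexagonal sensor lattice with unit spacing.
sensors = np.array([(i + 0.5 * (j % 2), j * np.sqrt(3) / 2)
                    for i in range(5) for j in range(5)], dtype=float)

def signal(src, amp, decay=2.0):
    # Received signal vector: power-law path loss (model assumed for
    # illustration), with distance clipped near the source.
    d = np.maximum(np.linalg.norm(sensors - src, axis=1), 0.3)
    return amp / d ** decay

def sample(amp, n=200):
    srcs = rng.random((n, 2)) * 4.0       # random source locations
    return np.array([signal(s, amp) for s in srcs])

S_I = sample(2.0)    # intruder: stronger source (assumed)
S_C = sample(0.5)    # clutter: weaker source (assumed)

# Separating hyperplane w.x = t with equal weights: an energy detector.
w = np.ones(len(sensors))
t = 0.5 * (np.median(S_I @ w) + np.median(S_C @ w))
acc = 0.5 * ((S_I @ w > t).mean() + (S_C @ w <= t).mean())
print(f"hyperplane detection accuracy: {acc:.2f}")
```

In a real deployment each sensor would transmit only a coarsely quantized version of its coordinate of w·x, which is where the communication-cost optimization enters.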


A "plan diagram" is a pictorial enumeration of the execution plan choices of a database query optimizer over the relational selectivity space. We have shown recently that, for industrial-strength database engines, these diagrams are often remarkably complex and dense, with a large number of plans covering the space. However, they can often be reduced to much simpler pictures, featuring significantly fewer plans, without materially affecting the query processing quality. Plan reduction has useful implications for the design and usage of query optimizers, including quantifying redundancy in the plan search space, enhancing usability of parametric query optimization, identifying error-resistant and least-expected-cost plans, and minimizing the overheads of multi-plan approaches. We investigate here the plan reduction issue from theoretical, statistical and empirical perspectives. Our analysis shows that optimal plan reduction, w.r.t. minimizing the number of plans, is an NP-hard problem in general, and remains so even for a storage-constrained variant. We then present a greedy reduction algorithm with tight and optimal performance guarantees, whose complexity scales linearly with the number of plans in the diagram for a given resolution. Next, we devise fast estimators for locating the best tradeoff between the reduction in plan cardinality and the impact on query processing quality. Finally, extensive experimentation with a suite of multi-dimensional TPC-H-based query templates on industrial-strength optimizers demonstrates that complex plan diagrams easily reduce to "anorexic" (small absolute number of plans) levels, incurring only marginal increases in the estimated query processing costs.
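
Greedy reduction can be sketched on a synthetic plan diagram: cells of the selectivity grid are re-assigned from a retired plan to a surviving one whenever the cost penalty stays within a threshold λ. The linear plan-cost surfaces, the value of λ and the particular greedy order are invented for illustration and ignore the storage-constrained variant discussed in the abstract:

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic plan diagram: each plan has a cost surface over a 2-D
# selectivity grid (linear cost models, assumed for illustration).
G, P = 20, 12
sel = np.linspace(0.01, 1.0, G)
xx, yy = np.meshgrid(sel, sel)
a = rng.random((P, 3))
costs = np.stack([a[p, 0] + a[p, 1] * xx + a[p, 2] * yy for p in range(P)])

optimal = costs.argmin(axis=0)       # the optimizer's plan diagram
lam = 0.1                            # allowed relative cost increase (10%)

def reduce_diagram(costs, optimal, lam):
    # Greedy reduction: repeatedly try to retire the plan covering the
    # fewest cells, re-assigning its cells to the cheapest surviving
    # plan provided the cost stays within (1 + lam) of optimal.
    assign = optimal.copy()
    alive = set(np.unique(assign))
    opt_cost = costs.min(axis=0)
    changed = True
    while changed:
        changed = False
        for p in sorted(alive, key=lambda q: (assign == q).sum()):
            others = sorted(alive - {p})
            if not others:
                continue
            cells = assign == p
            alt = np.stack([costs[q] for q in others])
            ok = alt.min(axis=0)[cells] <= (1 + lam) * opt_cost[cells]
            if ok.all():
                best = alt.argmin(axis=0)
                assign[cells] = np.array(others)[best[cells]]
                alive.remove(p)
                changed = True
                break
    return assign, alive

assign, alive = reduce_diagram(costs, optimal, lam)
print(f"plans: {len(np.unique(optimal))} -> {len(alive)}")
```

Every cell of the reduced diagram still costs at most (1 + λ) times its optimal cost, which is the quality guarantee the greedy scheme is meant to preserve.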


We recast the reconstruction problem of diffuse optical tomography (DOT) in a pseudo-dynamical framework and develop a method to recover the optical parameters using particle filters, i.e., stochastic filters based on Monte Carlo simulations. In particular, we have implemented two such filters, viz., the bootstrap (BS) filter and the Gaussian-sum (GS) filter and employed them to recover optical absorption coefficient distribution from both numerically simulated and experimentally generated photon fluence data. Using either indicator functions or compactly supported continuous kernels to represent the unknown property distribution within the inhomogeneous inclusions, we have drastically reduced the number of parameters to be recovered and thus brought the overall computation time to within reasonable limits. Even though the GS filter outperformed the BS filter in terms of accuracy of reconstruction, both gave fairly accurate recovery of the height, radius, and location of the inclusions. Since the present filtering algorithms do not use derivatives, we could demonstrate accurate contrast recovery even in the middle of the object where the usual deterministic algorithms perform poorly owing to the poor sensitivity of measurement of the parameters. Consistent with the fact that the DOT recovery, being ill posed, admits multiple solutions, both the filters gave solutions that were verified to be admissible by the closeness of the data computed through them to the data used in the filtering step (either numerically simulated or experimentally generated). (C) 2011 Optical Society of America
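
A bootstrap filter in a pseudo-dynamical setting can be sketched with a toy forward model, a scalar exponential attenuation standing in for the DOT solver; all parameters, noise levels and the number of pseudo-time steps below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

# Pseudo-dynamic bootstrap filter recovering a single static parameter
# (a stand-in for the absorption coefficient; the forward model is a toy
# exponential attenuation over a few path lengths, not a DOT solver).
true_mu = 0.8
L = np.linspace(0.5, 3.0, 6)                  # path lengths (hypothetical)

def forward(mu):
    return np.exp(-np.multiply.outer(mu, L))

data = forward(true_mu) + rng.normal(0.0, 0.01, len(L))

N = 500
particles = rng.uniform(0.1, 2.0, N)          # diffuse prior ensemble
sigma_y, sigma_x = 0.02, 0.01                 # measurement / jitter scales

for _ in range(30):                           # pseudo-time steps
    particles += rng.normal(0.0, sigma_x, N)  # artificial dynamics
    resid = forward(particles) - data
    logw = -0.5 * np.sum(resid**2, axis=1) / sigma_y**2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Multinomial resampling: the bootstrap step.
    particles = particles[rng.choice(N, N, p=w)]

print(f"estimate: {particles.mean():.2f} (true {true_mu})")
```

Note that no derivative of the forward model is ever taken, which mirrors the derivative-free character of the filtering approach in the abstract.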


In this work, an attempt has been made to evaluate the spatial variation of peak horizontal acceleration (PHA) and spectral acceleration (SA) values at rock level for south India based on probabilistic seismic hazard analysis (PSHA). These values were estimated by considering the uncertainties involved in magnitude, hypocentral distance and attenuation of seismic waves. Different models were used for the hazard evaluation, and they were combined together using a logic tree approach. For evaluating the seismic hazard, the study area was divided into small grids of size 0.1° × 0.1°, and the hazard parameters were calculated at the centre of each of these grid cells by considering all the seismic sources within a radius of 300 km. Rock level PHA values and SA at 1 s corresponding to 10% probability of exceedance in 50 years were evaluated for all the grid points. Maps showing the spatial variation of rock level PHA values and SA at 1 s for the entire south India are presented in this paper. To compare the seismic hazard for some of the important cities, the seismic hazard curves and the uniform hazard response spectrum (UHRS) at rock level with 10% probability of exceedance in 50 years are also presented in this work.
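
The PSHA recipe, magnitude recurrence, attenuation scatter, integration over magnitude, and conversion to a 50-year exceedance probability, can be sketched for a single source. The recurrence parameters and the toy attenuation relation below are illustrative assumptions, not the logic-tree models used in the study:

```python
import numpy as np
from math import erf

# Single-source PSHA sketch: truncated Gutenberg-Richter recurrence plus
# a toy lognormal attenuation relation (all parameters hypothetical).
nu = 0.5                    # annual rate of events with M >= m_min
b = 0.9                     # Gutenberg-Richter b-value
m_min, m_max = 4.0, 7.5
R = 50.0                    # source-to-site distance, km

mags = np.linspace(m_min, m_max, 200)
beta = b * np.log(10)
pdf = beta * np.exp(-beta * (mags - m_min)) \
      / (1 - np.exp(-beta * (m_max - m_min)))     # truncated G-R density

def median_pga(m, r):
    # Toy attenuation relation: ln PGA[g] = -3.5 + 0.9 M - 1.2 ln(r + 10).
    return np.exp(-3.5 + 0.9 * m - 1.2 * np.log(r + 10.0))

sigma_ln = 0.6              # aleatory scatter of ln PGA

def annual_rate_exceed(a):
    # Integrate P(PGA > a | m) under lognormal scatter over magnitude.
    z = (np.log(a) - np.log(median_pga(mags, R))) / sigma_ln
    p_exc = 0.5 * (1.0 - np.vectorize(erf)(z / np.sqrt(2)))
    return nu * np.trapz(p_exc * pdf, mags)

levels = np.logspace(-3, 0, 60)
rates = np.array([annual_rate_exceed(a) for a in levels])
p50 = 1.0 - np.exp(-rates * 50.0)        # Poisson occurrence, 50-yr window
pga_10in50 = levels[np.argmin(np.abs(p50 - 0.10))]
print(f"PGA with ~10% probability of exceedance in 50 yr: {pga_10in50:.3f} g")
```

This is the hazard-curve step only; the study additionally weights alternative models through a logic tree and repeats the calculation on every 0.1° grid cell.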