9 results for National Assessment of Educational Progress (Project)

at Indian Institute of Science - Bangalore - India


Relevance:

100.00%

Publisher:

Abstract:

The demand for tunnelling and underground space creation is growing rapidly, driven by civil infrastructure projects and urbanisation. Blasting remains the most inexpensive method of underground excavation in hard rock. Unfortunately, no specific safety guidelines are available for blasted tunnels with regard to threshold limits on vibrations caused by repeated blasting activity in close proximity. This paper presents the results of a comprehensive study conducted to determine the effect of repeated blast loading on the damage experienced by a jointed basaltic rock mass during tunnelling works. Multiple rounds of blasting conducted for various civil excavations in a railway tunnel imparted repeated loading on the rock mass of the tunnel's sidewalls and roof. The blast-induced damage was assessed using vibration attenuation equations based on the charge-weight scaling law and measured with borehole extensometers and a borehole camera. Ground vibrations from each blasting round were also monitored by triaxial geophones installed near the borehole extensometers. The peak particle velocity (V-max) observations and plastic deformations from the borehole extensometers were used to develop a site-specific damage model. The study reveals that repeated dynamic loading imparted on the exposed tunnel by subsequent blasts in the vicinity resulted in rock mass damage at vibration levels lower than the critical peak particle velocity (V-cr). It was found that the repeated blast loading resulted in near-field damage due to high-frequency waves and far-field damage due to low-frequency waves. The far-field damage, after 45-50 occurrences of blast loading, was up to 55% of the near-field damage in the basaltic rock mass. The findings clearly indicate that the phenomenon of repeated blasting, with respect to the number of loading cycles, should be taken into consideration for a proper assessment of blast-induced damage in underground excavations.
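
For concreteness, a charge-weight scaling attenuation relation of the kind referenced above can be sketched as follows. This is a minimal sketch assuming square-root scaling, PPV = K (R / sqrt(Q))^(-beta); the constants K, beta and the critical velocity V_cr are hypothetical placeholders, not the fitted site-specific values from the study.

```python
import numpy as np

# Hypothetical site constants; in the study these would be fitted to the
# geophone records for the basaltic rock mass.
K, beta = 500.0, 1.6          # attenuation constants (assumed)
V_cr = 110.0                  # assumed critical PPV (mm/s) for single-blast damage

def peak_particle_velocity(distance_m, charge_kg):
    """Predicted PPV (mm/s) at a given distance for a given charge per delay,
    using square-root charge-weight scaling."""
    scaled_distance = distance_m / np.sqrt(charge_kg)
    return K * scaled_distance ** (-beta)

# Example: PPV at 10 m from a 25 kg charge per delay
v = peak_particle_velocity(10.0, 25.0)
print(f"Predicted PPV: {v:.1f} mm/s, exceeds V_cr: {v > V_cr}")
```

The study's point is that, under repeated loading, damage accumulates even when each individual round stays below V_cr, so a single-blast threshold check like the one above is not sufficient on its own.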

Relevance:

100.00%

Publisher:

Abstract:

Crystals growing from solution, from the vapour phase and from supercooled melt exhibit, as a rule, planar faces. The geometry and distribution of dislocations present within the crystals thus grown are strongly related to the growth on planar faces and to the different growth sectors, rather than to the physical properties of the crystals and the growth methods employed. As a result, many features of the generation and geometrical arrangement of defects are common to extremely different crystal species. In this paper these common aspects of dislocation generation and configuration, which permit one to predict their nature and distribution, are discussed. For imaging the defects, a versatile and widely applicable technique, namely X-ray diffraction topography, is used. Growth dislocations in solution-grown crystals follow straight paths with well-defined directions. These preferred directions, which in most cases lie within an angle of ±15° to the growth normal, depend on the growth direction and on the Burgers vector involved. The potential configuration of dislocations in growing crystals can be evaluated using the theory developed by Klapper, which is based on linear anisotropic elasticity theory. The preferred line direction of a particular dislocation is that for which the dislocation energy per unit growth length is a minimum. Line direction analysis based on this theory enables one to characterise dislocations propagating in a growing crystal. A combined theoretical analysis and experimental investigation based on the above theory is presented.
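
The minimum-energy criterion can be illustrated numerically. The sketch below minimises W(l) = E(l)/cos(alpha) over candidate line directions, where alpha is the angle to the growth normal; the energy function E(theta) used here is a hypothetical stand-in, whereas a real analysis would evaluate E from the anisotropic elastic constants and the Burgers vector.

```python
import numpy as np

def line_energy(theta):
    """Hypothetical elastic energy per unit length versus the angle (rad)
    between the dislocation line and the growth normal (placeholder model)."""
    return 1.0 + 0.3 * np.cos(2.0 * theta)

angles = np.radians(np.linspace(-60, 60, 241))   # candidate line directions
W = line_energy(angles) / np.cos(angles)         # energy per unit growth length
theta_opt = np.degrees(angles[np.argmin(W)])
print(f"Preferred line direction: {theta_opt:.1f} deg from the growth normal")
```

With this toy energy model the minimum falls close to the growth normal, consistent with the observation that preferred directions mostly lie within about ±15° of it.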

Relevance:

100.00%

Publisher:

Abstract:

Downscaling from large-scale atmospheric variables simulated by general circulation models (GCMs) to station-scale hydrologic variables is usually necessary to assess the hydrologic impact of climate change. This work presents CRF-downscaling, a new probabilistic downscaling method that represents the daily precipitation sequence as a conditional random field (CRF). The conditional distribution of the precipitation sequence at a site, given the daily atmospheric (large-scale) variable sequence, is modeled as a linear-chain CRF. CRFs make no assumption of independence among observations, which gives them flexibility in using high-dimensional feature vectors. Maximum likelihood parameter estimation for the model is performed using limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) optimization. Maximum a posteriori estimation with the Viterbi algorithm is used to determine the most likely precipitation sequence for a given set of atmospheric input variables. Direct classification of dry/wet days as well as prediction of precipitation amounts is achieved within a single modeling framework. The model is used to project the future cumulative distribution function of precipitation. Uncertainty in precipitation prediction is addressed through a modified Viterbi algorithm that predicts the n most likely sequences. The model is applied to downscale monsoon (June-September) daily precipitation at eight sites in the Mahanadi basin in Orissa, India, using the MIROC3.2 medium-resolution GCM. The predicted distributions at all sites show an increase in the number of wet days as well as in wet-day precipitation amounts. A comparison of current and future predicted probability density functions for daily precipitation shows a change in the shape of the density function, with decreasing probability of lower precipitation and increasing probability of higher precipitation.
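
A linear-chain CRF with L-BFGS training and Viterbi decoding of this general kind can be sketched with the third-party sklearn_crfsuite package. This is a minimal sketch, not the paper's implementation: the feature names, values and wet/dry labels below are hypothetical placeholders for the large-scale atmospheric predictors.

```python
import sklearn_crfsuite  # third-party linear-chain CRF trained with L-BFGS

def day_features(atmos_row):
    """Feature dict for one day; the predictor names are assumptions."""
    return {
        "mslp": atmos_row["mslp"],              # mean sea-level pressure
        "sp_humidity": atmos_row["sp_humidity"],
        "u_wind": atmos_row["u_wind"],
        "v_wind": atmos_row["v_wind"],
    }

# One toy monsoon sequence (list of days); labels are the wet/dry states.
X_train = [[day_features({"mslp": 1002.1, "sp_humidity": 0.017, "u_wind": 3.2, "v_wind": -1.1}),
            day_features({"mslp": 1000.4, "sp_humidity": 0.019, "u_wind": 4.0, "v_wind": -0.6})]]
y_train = [["dry", "wet"]]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=200)
crf.fit(X_train, y_train)

# Viterbi decoding of the most likely label sequence for a given feature sequence
print(crf.predict(X_train)[0])
```

In practice the training data would span many monsoon seasons per site, and the n-best decoding described in the abstract would replace the single most likely sequence returned here.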

Relevance:

100.00%

Publisher:

Abstract:

Nanotechnology is a new technology that is generating a lot of interest among academicians, practitioners and scientists. Critical research is being carried out in this area all over the world. Governments are creating policy initiatives to promote developments in nanoscale science and technology, and private investment is also on a rising trend. A large number of academic institutions and national laboratories have set up research centres working on the multiple applications of nanotechnology. A wide range of applications is claimed for nanotechnology, from materials, chemicals, textiles and semiconductors to drug delivery systems and diagnostics. Nanotechnology is considered to be the next big wave of technology after information technology and biotechnology. In fact, nanotechnology holds the promise of advances that exceed those achieved in recent decades in computers and biotechnology. Much of the interest in nanotechnology can also be attributed to the enormous monetary benefits expected from nanotechnology-based products. According to the NSF, revenues from nanotechnology could touch $1 trillion by 2015. However, much of these benefits are projections. Realizing the claimed benefits requires the successful development of nanoscience and nanotechnology research efforts; that is, the journey from invention to innovation has to be completed. For this to happen, the technology has to flow from laboratory to market, and nanoscience and nanotechnology research efforts have to result in new products, new processes and new platforms. India has also started its nanoscience and nanotechnology development programme under its 10th Five Year Plan, and funds worth Rs. one billion have been allocated for nanoscience and nanotechnology research and development. The aim of this paper is to assess nanoscience and nanotechnology initiatives in India. We propose a conceptual model derived from the resource-based view of innovation. We have developed a structured questionnaire to measure the constructs in the conceptual model. Responses have been collected from 115 scientists and engineers working in the field of nanoscience and nanotechnology. The responses have been analyzed further using Principal Component Analysis, Cluster Analysis and Regression Analysis.
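
The analysis pipeline named above (principal component analysis, cluster analysis, regression analysis) can be sketched as follows. This is a minimal sketch on randomly generated stand-in data for the 115 respondents; the number of questionnaire items, the outcome variable and all parameter choices are hypothetical, not the study's.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

# Toy Likert-type responses: 115 respondents x 20 items (placeholder data).
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(115, 20)).astype(float)
outcome = responses[:, :5].mean(axis=1) + rng.normal(0, 0.3, 115)  # assumed outcome construct

# 1) Reduce the items to a few principal components (candidate constructs).
pca = PCA(n_components=4)
components = pca.fit_transform(responses)

# 2) Cluster respondents on their component scores.
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(components)

# 3) Regress the outcome on the component scores.
reg = LinearRegression().fit(components, outcome)

print("Explained variance ratios:", np.round(pca.explained_variance_ratio_, 2))
print("Cluster sizes:", np.bincount(clusters))
print("R^2 of regression:", round(reg.score(components, outcome), 2))
```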

Relevance:

100.00%

Publisher:

Abstract:

We propose a set of metrics that evaluate the uniformity, sharpness, continuity, noise, stroke width variance, pulse width ratio, transient pixel density, entropy and variance of components to quantify the quality of a document image. The measures are intended to be used with any optical character recognition (OCR) engine to estimate, a priori, the expected performance of the OCR. The suggested measures have been evaluated on many document images in different scripts. The quality of each document image is manually annotated by users to create a ground truth. The idea is to correlate the values of the measures with the user-annotated data: if the calculated measure matches the annotated description, the metric is accepted; otherwise it is rejected. Of the metrics proposed, some are accepted and the rest are rejected. We have defined metrics that are easy to estimate. The metrics proposed in this paper are based on feedback from home-grown OCR engines for Indic (Tamil and Kannada) languages. The metrics are independent of the script and depend only on the quality and age of the paper and the printing. Experiments and results for each proposed metric are discussed. Actual recognition of the printed text is not performed to evaluate the proposed metrics. Sometimes a document image containing broken characters is rated as a good document image by the evaluated metrics; this remains an unsolved challenge. The proposed measures work on grayscale document images and fail to provide reliable information on binarized document images.
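
Two of the simpler measures named above can be illustrated on a grayscale page image. This is a minimal sketch with assumed definitions, histogram entropy and the variance of connected-component heights as a rough proxy for "variance of components"; the binarization threshold and the toy image are hypothetical, and the paper's exact formulations may differ.

```python
import numpy as np
from scipy import ndimage

def histogram_entropy(gray):
    """Shannon entropy (bits) of the grayscale intensity histogram."""
    hist, _ = np.histogram(gray, bins=256, range=(0, 256), density=True)
    p = hist[hist > 0]
    return float(-np.sum(p * np.log2(p)))

def component_height_variance(gray, threshold=128):
    """Variance of connected-component heights after a simple binarization
    (ink assumed darker than paper); threshold is an assumption."""
    binary = gray < threshold
    labels, _ = ndimage.label(binary)
    heights = [sl[0].stop - sl[0].start for sl in ndimage.find_objects(labels)]
    return float(np.var(heights)) if heights else 0.0

# Toy "page": random grayscale noise standing in for a scanned document.
gray = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)
print(histogram_entropy(gray), component_height_variance(gray))
```

Note that, as the abstract cautions, such measures operate on the grayscale image; computing them after binarization discards the information they rely on.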

Relevance:

100.00%

Publisher:

Abstract:

The safety of an in-service brick arch railway bridge is assessed through field testing and finite-element analysis. Different test train loading configurations were used in the field testing. The response of the bridge in terms of displacements, strains, and accelerations is measured under ambient and design train traffic loading conditions. Nonlinear fracture mechanics-based finite-element analyses are performed to assess the margin of safety. A parametric study is carried out to examine the effect of tensile strength on the progress of cracking in the arch. Furthermore, a stability analysis is carried out to assess collapse of the arch caused by lateral movement at the springing of one of the abutments, which is elastically supported. The margin of safety with respect to cracking and stability failure is computed. Conclusions are drawn, with some remarks on the state of the bridge within the framework of the available and inferred information. DOI: 10.1061/(ASCE)BE.1943-5592.0000338. (C) 2013 American Society of Civil Engineers.

Relevance:

100.00%

Publisher:

Abstract:

Northeast India is one of the most seismically active regions in the world, with, on average, more than seven earthquakes of magnitude 5.0 and above per year. Reliable seismic hazard assessment can provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been applied for seismic hazard assessment of the Tripura and Mizoram states at bedrock level. An updated earthquake catalogue was compiled from various national and international seismological agencies for the period from 1731 to 2011. Homogenization, declustering and data completeness analysis of the events were carried out before hazard evaluation. Seismicity parameters were estimated using the Gutenberg-Richter (G-R) relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanisms, the region was divided into six major subzones. Region-specific correlations were used for magnitude conversion to homogenize earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against the observed PGA (peak ground acceleration) values before use in the hazard evaluation. In this study, the hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA from both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2% and 10% probabilities of exceedance in 50 years, and as spectral acceleration (T = 0.2 s, 1.0 s) for both states (2% probability of exceedance in 50 years). The results provide important inputs for planning risk-reduction strategies, developing risk-acceptance criteria and carrying out financial analysis of possible damages in the study area, together with a comprehensive analysis and higher-resolution hazard mapping.
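
Two standard relations underlying the assessment above can be sketched as follows: the Gutenberg-Richter recurrence law, log10 N(M) = a - b*M, and the conversion of a probability of exceedance over a design life into a return period under a Poisson assumption, T = -t / ln(1 - P). The a and b values below are hypothetical placeholders (chosen to roughly reproduce the ~7 events per year of M >= 5.0 mentioned above), not the study's fitted zone parameters.

```python
import numpy as np

a, b = 5.35, 0.9                      # hypothetical G-R parameters for a source zone

def annual_rate(magnitude):
    """Annual number of events with magnitude >= M from log10 N = a - b*M."""
    return 10.0 ** (a - b * magnitude)

def return_period(prob_exceedance, life_years=50.0):
    """Return period (years) for a given probability of exceedance in life_years,
    assuming Poisson occurrence of exceedances."""
    return -life_years / np.log(1.0 - prob_exceedance)

print(f"Events/yr with M >= 5.0: {annual_rate(5.0):.1f}")
print(f"10% in 50 yr -> return period ~ {return_period(0.10):.0f} yr")   # ~475 yr
print(f" 2% in 50 yr -> return period ~ {return_period(0.02):.0f} yr")   # ~2475 yr
```

The two exceedance levels reported in the study thus correspond to hazard at roughly 475-year and 2475-year return periods.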