7 results for Data Interpretation

at Indian Institute of Science - Bangalore - India


Relevance:

60.00%

Publisher:

Abstract:

Different seismic hazard components pertaining to Bangalore city, namely soil overburden thickness, effective shear-wave velocity, factor of safety against liquefaction potential, peak ground acceleration at the seismic bedrock, site response in terms of amplification factor, and the predominant frequency, have been individually evaluated. The overburden thickness distribution, predominantly in the range of 5-10 m in the city, has been estimated through a sub-surface model built from geotechnical bore-log data. The effective shear-wave velocity distribution, established through a Multichannel Analysis of Surface Waves (MASW) survey and subsequent data interpretation through dispersion analysis, exhibits site class D (180-360 m/s), site class C (360-760 m/s), and site class B (760-1500 m/s) in compliance with the National Earthquake Hazards Reduction Program (NEHRP) nomenclature. The peak ground acceleration has been estimated through a deterministic approach, based on the maximum credible earthquake of Mw = 5.1 assumed to nucleate from the closest active seismic source (the Mandya-Channapatna-Bangalore Lineament). The 1-D site response factor, computed at each borehole through geotechnical analysis across the study region, ranges from an amplification of about one to as high as four. Correspondingly, the predominant frequency estimated from the Fourier spectrum is found to lie predominantly in the range of 3.5-5.0 Hz. The soil liquefaction hazard has been assessed in terms of the factor of safety against liquefaction potential using standard penetration test data and the underlying soil properties, which indicate that 90% of the study region is non-liquefiable. The spatial distributions of the different hazard entities are placed on a GIS platform and subsequently integrated through the analytic hierarchy process. The resulting deterministic hazard map shows high hazard coverage in the western areas. The microzonation thus achieved is envisaged as a first-cut assessment of the site-specific hazard, laying out a framework for higher-order seismic microzonation, as well as a useful decision-support tool in overall land-use planning and hazard management. (C) 2010 Elsevier Ltd. All rights reserved.
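The layer-integration step lends itself to a short illustration. Below is a minimal sketch, not the authors' implementation, of deriving analytic-hierarchy-process weights from a pairwise comparison matrix and applying them as a weighted overlay of normalized hazard rasters; the comparison values, the three layer names, and the grid are all hypothetical.

```python
# Sketch only: AHP weights from a pairwise comparison matrix, then a weighted
# overlay of normalized hazard layers. Matrix entries and layers are hypothetical.
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights of an AHP pairwise comparison matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    return principal / principal.sum()

# Hypothetical 3x3 comparison matrix for PGA, site amplification, liquefaction FS
pairwise = np.array([[1.0, 2.0, 3.0],
                     [1/2.0, 1.0, 2.0],
                     [1/3.0, 1/2.0, 1.0]])
w = ahp_weights(pairwise)

# Hypothetical normalized hazard rasters (values in [0, 1]) on a common grid
pga, amplification, liquefaction = (np.random.rand(100, 100) for _ in range(3))
hazard_index = w[0] * pga + w[1] * amplification + w[2] * liquefaction
```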

Relevance:

60.00%

Publisher:

Abstract:

Ultrasonic C-scan is very often used to detect flaws and defects in composite components, both those introduced during fabrication and damage resulting from service conditions. Evaluation and characterization of defects and damage in composites require experience and a good understanding of the material, as composites are distinctly different in composition and behavior from conventional metallic materials. The failure mechanisms in composite materials are quite complex: they involve the interaction of matrix cracking, fiber-matrix interface debonding, fiber pullout, fiber fracture, and delamination. Generally all of these occur together, making stress and failure analysis very complex. Under low-velocity impact loading, delamination is observed to be a major failure mode. In composite materials the ultrasonic waves suffer high acoustic attenuation and scattering, which makes data interpretation difficult. However, these difficulties can be overcome to a great extent by proper selection of the probe, probe parameter settings such as pulse width, pulse amplitude, pulse repetition rate, delay, blanking, and gain, and by data processing, which includes image processing of the image obtained from the C-scan.
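As an illustration of the kind of image processing mentioned above, the following hedged sketch smooths a C-scan amplitude map and thresholds it to segment low-amplitude regions as candidate delaminations; the synthetic image, threshold value, and scan step are assumptions, not values from the paper.

```python
# Sketch only: smoothing and thresholding a C-scan amplitude image to flag
# strongly attenuated regions as possible flaws. All numbers are placeholders.
import numpy as np
from scipy import ndimage

cscan = np.random.rand(256, 256)                   # placeholder amplitude map in [0, 1]
smoothed = ndimage.median_filter(cscan, size=5)    # suppress speckle/scatter noise

threshold = 0.2                                    # hypothetical attenuation threshold
damage_mask = smoothed < threshold                 # strong attenuation => candidate flaw

labels, n_flaws = ndimage.label(damage_mask)       # connected flaw regions
pixel_area = 0.25 * 0.25                           # assumed 0.25 mm scan step
flaw_areas_mm2 = ndimage.sum(np.ones_like(cscan), labels,
                             list(range(1, n_flaws + 1))) * pixel_area
```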

Relevance:

30.00%

Publisher:

Abstract:

The determination of the overconsolidation ratio (OCR) of clay deposits is an important task in geotechnical engineering practice. This paper examines the potential of a support vector machine (SVM) for predicting the OCR of clays from piezocone penetration test data. The SVM is based on statistical learning theory and the structural risk minimization principle, which minimizes both the error and the weight terms. The five input variables used in the SVM model for prediction of OCR are the corrected cone resistance (qt), vertical total stress (sigma_v), hydrostatic pore pressure (u0), pore pressure at the cone tip (u1), and pore pressure just above the cone base (u2). A sensitivity analysis has been performed to investigate the relative importance of each of the input parameters; it shows that qt is the primary in situ parameter influenced by OCR, followed by sigma_v, u0, u2, and u1. A comparison between the SVM and some of the traditional interpretation methods is also presented. The results of this study show that the SVM approach has the potential to be a practical tool for determination of OCR.
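For readers unfamiliar with the setup, a minimal sketch of such a model follows, using scikit-learn's SVR rather than the authors' implementation; the data arrays, kernel choice, and hyperparameters are placeholders, and in practice the five input columns would come from CPTu logs paired with laboratory OCR values.

```python
# Sketch only: an SVM regressor mapping the five piezocone inputs to OCR.
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.random((200, 5))      # columns: qt, sigma_v, u0, u1, u2 (placeholder values)
y = rng.random(200) * 10      # placeholder OCR values

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)
ocr_pred = model.predict(X[:5])   # predicted OCR for the first five soundings
```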

Relevance:

30.00%

Publisher:

Abstract:

Image fusion is a formal framework expressed as means and tools for the alliance of multisensor, multitemporal, and multiresolution data. Multisource data vary in spectral, spatial, and temporal resolution, necessitating advanced analytical or numerical techniques for enhanced interpretation capabilities. This paper reviews seven pixel-based image fusion techniques: intensity-hue-saturation (IHS), Brovey, high pass filter (HPF), high pass modulation (HPM), principal component analysis, Fourier transform, and correspondence analysis. Validation of these techniques on IKONOS data (panchromatic band at 1 m spatial resolution and four multispectral bands at 4 m spatial resolution) reveals that the HPF and HPM methods synthesise the images closest to those the corresponding multispectral sensors would observe at the high-resolution level.
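The HPF idea can be sketched in a few lines: high-frequency detail extracted from the panchromatic band is added to each upsampled multispectral band. The sketch below is illustrative only; the array sizes, smoothing kernel, and injection gain are assumptions rather than the settings used in the paper.

```python
# Sketch only: high pass filter (HPF) pan-sharpening of placeholder IKONOS-like data.
import numpy as np
from scipy import ndimage

pan = np.random.rand(400, 400)               # 1 m panchromatic band (placeholder)
ms = np.random.rand(4, 100, 100)             # 4 m multispectral, 4 bands (placeholder)

ms_up = np.stack([ndimage.zoom(band, 4, order=1) for band in ms])  # resample to pan grid
pan_low = ndimage.uniform_filter(pan, size=5)                      # low-pass version of pan
detail = pan - pan_low                                             # high-frequency component

gain = 1.0                                   # assumed injection weight
fused = ms_up + gain * detail                # HPF-fused bands (broadcast over 4 bands)
```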

Relevance:

30.00%

Publisher:

Abstract:

The conformational dependence of interproton distances in model proline peptides has been investigated in order to facilitate interpretation of the results of Nuclear Overhauser Effect (NOE) studies on such peptides. For this purpose two model systems, namely Ac-Pro-NHMe and Ac-Pro-X-NHMe, have been chosen. In the former, short interproton distances detectable in NOE experiments permit a clear distinction between conformations with Pro ψ = -30° (helical region) and those in which ψ is around 120° (polyproline region). For the latter, the variation of the distances between the methyl amide protons and the Pro ring protons has been studied by superimposing them on the Ramachandran map in the (φ3, ψ3) plane. The results show that β-turn and non-β-turn conformations can be readily distinguished from NOE data, and that such long-range NOEs should be detectable for specific non-β-turn conformations. NOEs involving Cβ and Cγ protons are particularly sensitive to the state of pyrrolidine ring puckering.
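The sensitivity of NOEs to these short distances follows from the approximately r^-6 dependence of the cross-relaxation rate in the isolated two-spin approximation; the toy calculation below, using hypothetical distances rather than values from the paper, illustrates why conformations in the two ψ regions are so readily told apart.

```python
# Sketch only: relative NOE intensities for two hypothetical interproton distances,
# using the r**-6 scaling of the two-spin cross-relaxation rate.
r_helical = 2.2       # hypothetical H-H distance (angstrom) for psi near -30 deg
r_polyproline = 3.5   # hypothetical H-H distance (angstrom) for psi near 120 deg

relative_noe = (r_helical / r_polyproline) ** -6
print(f"Expected NOE ratio (helical / polyproline): {relative_noe:.1f}")
# Roughly a 16-fold difference for these assumed distances.
```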

Relevance:

30.00%

Publisher:

Abstract:

This paper describes techniques to estimate the worst-case execution time of executable code on architectures with data caches. The underlying mechanism is abstract interpretation, which is used for the dual purposes of tracking address computations and cache behavior. A simultaneous numeric and pointer analysis, using an abstraction for discrete sets of values, computes safe approximations of access addresses, which are then used to predict cache behavior using Must Analysis. A heuristic is also proposed which generates likely worst-case estimates; it can be used in soft real-time systems and also for reasoning about the tightness of the safe estimate. The analysis methods can handle programs with non-affine access patterns, for which conventional Presburger arithmetic formulations or Cache Miss Equations do not apply. The precision of the estimates is user-controlled and can be traded off against analysis time. Executables are analyzed directly, which, apart from enhancing precision, renders the method language-independent.
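To make the Must Analysis step concrete, here is a hedged sketch, not the paper's implementation, of the join operation in an age-based abstract cache domain: a memory block survives the join only if it is guaranteed cached on all incoming paths, and it keeps the worst (maximum) age observed on any path.

```python
# Sketch only: the join of two abstract must-cache states for an LRU cache set.
def must_join(state_a, state_b):
    """state_a, state_b: dict mapping memory block -> LRU age (0 = most recent)."""
    joined = {}
    for block in state_a.keys() & state_b.keys():   # present on *all* incoming paths
        joined[block] = max(state_a[block], state_b[block])   # keep the worst age
    return joined

# Example: block 'x' is guaranteed cached after the join, 'y' is not.
path1 = {"x": 0, "y": 2}
path2 = {"x": 1}
print(must_join(path1, path2))   # {'x': 1}
```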

Relevance:

30.00%

Publisher:

Abstract:

We propose a new approach for producing precise constrained slices of programs in a language such as C. We build upon a previous term-rewriting-based approach to this problem, which primarily targets loop-free fragments and is fully precise in that setting. We incorporate abstract interpretation into term rewriting, using a given arbitrary abstract lattice, resulting in a novel technique for slicing loops whose precision is linked to the power of the given lattice. We address pointers in a first-class manner, including when they are used within loops to traverse and update recursive data structures. Finally, we illustrate the comparative precision of our slices over those of previous approaches using representative examples.
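As an illustration of the kind of abstract lattice that could be supplied to such a slicer, the sketch below implements a toy interval domain with join and widening; it is our example of a pluggable lattice under these assumptions, not the paper's system.

```python
# Sketch only: an interval abstract domain with join and widening, the sort of
# lattice an abstract-interpretation-based loop analysis can be parameterized by.
from dataclasses import dataclass
import math

@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float

    def join(self, other):
        """Least upper bound of two intervals."""
        return Interval(min(self.lo, other.lo), max(self.hi, other.hi))

    def widen(self, other):
        """Standard widening: jump to +/- infinity on unstable bounds."""
        lo = self.lo if other.lo >= self.lo else -math.inf
        hi = self.hi if other.hi <= self.hi else math.inf
        return Interval(lo, hi)

# Abstractly executing `i = 0; while ...: i = i + 1` stabilizes at [0, +inf)
i_val = Interval(0, 0)
i_val = i_val.widen(i_val.join(Interval(i_val.lo + 1, i_val.hi + 1)))
print(i_val)   # Interval(lo=0, hi=inf)
```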