61 results for Radar technology
Abstract:
Fiber Bragg grating (FBG) and long period grating (LPG) chemical sensors are among the most exciting developments in the field of optical fiber sensors. In this paper we propose a simple and effective chemical sensor based on FBG and LPG techniques for detecting traces of cadmium (Cd) in drinking water at the ppm level. The sensitivities of the two techniques have been compared, and the results have also been compared with those obtained using sophisticated atomic absorption and emission spectrometry instruments. For the FBG to act as a concentration sensor, the cladding region of the grating has been etched using an HF solution. We have characterized the FBG concentration sensor sensitivities for Cd concentrations varying from 0.01 ppm to 0.04 ppm, observing the reflected spectrum of the FBG and the transmitted spectrum of the LPG with an optical spectrum analyzer. Suitable reagents have been used in the solutions for detection of the Cd species. The overall wavelength shift is 10 nm for the LPG and the Bragg wavelength shift is 0.07 nm for the FBG over the 0.01-0.04 ppm concentration range. (C) 2011 Elsevier B.V. All rights reserved.
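As a rough illustration of the figures quoted above, a minimal Python sketch of the average wavelength-shift sensitivity over the reported concentration range, assuming a linear response (an assumption the abstract does not state):

```python
# Sketch: average wavelength-shift sensitivity of the LPG and etched-FBG Cd
# sensors, assuming a linear response over the reported 0.01-0.04 ppm range.
# The shift values come from the abstract; everything else is illustrative.

def sensitivity_nm_per_ppm(total_shift_nm, c_min_ppm, c_max_ppm):
    """Average wavelength shift per unit concentration."""
    return total_shift_nm / (c_max_ppm - c_min_ppm)

lpg = sensitivity_nm_per_ppm(10.0, 0.01, 0.04)   # ~333 nm/ppm
fbg = sensitivity_nm_per_ppm(0.07, 0.01, 0.04)   # ~2.3 nm/ppm

print(f"LPG sensitivity: {lpg:.1f} nm/ppm")
print(f"Etched-FBG sensitivity: {fbg:.2f} nm/ppm")
```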
Abstract:
This paper discusses the use of Jason-2 radar altimeter measurements to estimate the Ganga-Brahmaputra surface freshwater flux into the Bay of Bengal for the period mid-2008 to December 2011. A previous estimate was generated for 1993-2008 using TOPEX-Poseidon, ERS-2 and ENVISAT, and is now extended using Jason-2. To take full advantage of the newly available in situ rating curves, the processing scheme is adapted and the adjustments to the methodology are discussed here. First, using a large sample of in situ river height measurements, we estimate the standard error of Jason-2-derived water levels over the Ganga and the Brahmaputra to be 0.28 m and 0.19 m respectively, or less than ~4% of the annual peak-to-peak variations of these two rivers. Using the in situ rating curves between water levels and river discharges, we show that Jason-2 accurately infers Ganga and Brahmaputra instantaneous discharges for 2008-2011, with mean errors ranging from ~2180 m³/s (6.5%) over the Brahmaputra to ~1458 m³/s (13%) over the Ganga. The combined Ganga-Brahmaputra monthly discharges meet the requirement of acceptable accuracy (15-20%), with a mean error of ~16% for 2009-2011 and ~17% for 1993-2011. The Ganga-Brahmaputra monthly discharge at the river mouths is then presented, showing a marked interannual variability with a standard deviation of ~12500 m³/s, much larger than the data set uncertainty. Finally, using in situ sea surface salinity observations, we illustrate the possible impact of an extreme continental freshwater discharge event on the northern Bay of Bengal as observed in 2008.
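The rating-curve step described above (water level to discharge) can be sketched as follows; the power-law form and the coefficients are illustrative assumptions, not the curves fitted to the in situ gauge data in the study:

```python
# Sketch: convert an altimeter-derived water level h into an instantaneous
# discharge Q via a power-law rating curve Q = a * (h - h0)**b.
# a, h0 and b below are hypothetical placeholders.

import numpy as np

def discharge_from_level(h_m, a=500.0, h0_m=2.0, b=1.6):
    """Power-law rating curve; returns discharge in m^3/s."""
    return a * np.maximum(h_m - h0_m, 0.0) ** b

levels = np.array([4.5, 6.0, 7.8])    # illustrative Jason-2 water levels (m)
print(discharge_from_level(levels))   # instantaneous discharges (m^3/s)
```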
Abstract:
The basic framework and conceptual understanding of the metallurgy of Ti alloys is strong, and this has enabled the use of titanium and its alloys in safety-critical structures such as those in aircraft and aircraft engines. Nevertheless, a focus on cost-effectiveness and the compression of product development time by effectively integrating design with manufacturing in these applications, as well as those emerging in bioengineering, has driven research in recent decades towards a greater predictive capability through the use of computational materials engineering tools. This paper therefore focuses on the complexity and variety of fundamental phenomena in this material system, with emphasis on phase transformations and mechanical behaviour, in order to delineate the challenges that lie ahead in achieving these goals. (C) 2012 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
Abstract:
With the rapid scaling down of semiconductor process technology, process-variation-aware circuit design has become essential. Several statistical models have been proposed to deal with process variation. We propose an accurate BSIM model for handling variability in 45 nm CMOS technology. The MOSFET is designed to meet the low-standby-power technology specification of the International Technology Roadmap for Semiconductors (ITRS). The process parameter variations considered for model development are annealing temperature, oxide thickness, halo dose and tilt angle of the halo implant, with one parameter varied at a time. The model is validated by matching its performance against device simulation results, and the reported error is less than 10%. © 2012 Society of Photo-Optical Instrumentation Engineers (SPIE).
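A minimal sketch of the one-parameter-at-a-time validation criterion implied above; `compact_model_idsat` and `tcad_idsat` are hypothetical stand-ins for the fitted compact model and the device-simulation output, not the paper's data:

```python
# Sketch: vary a single process parameter, evaluate the compact model and the
# device simulation at each point, and check the relative error stays under 10%.

import numpy as np

def relative_error_pct(model_vals, tcad_vals):
    return 100.0 * np.abs(model_vals - tcad_vals) / np.abs(tcad_vals)

# Toy stand-ins so the sketch runs; real values come from BSIM and TCAD runs.
tox_nm = np.linspace(1.0, 1.4, 5)                       # oxide-thickness sweep
compact_model_idsat = 1e-4 * (1.2 / tox_nm)             # hypothetical I_dsat (A)
tcad_idsat = compact_model_idsat * (1 + 0.03 * np.sin(tox_nm))

err = relative_error_pct(compact_model_idsat, tcad_idsat)
print("max error %.1f%% -> %s" % (err.max(), "OK" if err.max() < 10 else "refit"))
```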
Abstract:
Chronic recording of neural signals is indispensable for designing efficient brain-machine interfaces and for elucidating human neurophysiology. The advent of multichannel microelectrode arrays has driven the need for electronics that can record neural signals from many neurons. The dynamic range of the system is limited by background system noise, which varies over time. We propose a neural amplifier in UMC 130 nm, 2P8M CMOS technology. It can be biased adaptively from 200 nA to 2 µA, modulating the input-referred noise from 9.92 µV to 3.9 µV. We also describe a low-noise design technique which minimizes the noise contribution of the load circuitry. The amplifier passes signals from 5 Hz to 7 kHz while rejecting input DC offsets at the electrode-electrolyte interface. The bandwidth of the amplifier can be tuned via the pseudo-resistor for selectively recording local field potentials (LFP) or extracellular action potentials (EAP). The amplifier achieves a mid-band voltage gain of 37 dB and minimizes the attenuation of the signal from the neuron to the gate of the input transistor. It is used in a fully differential configuration to reject noise from the bias circuitry and to achieve high PSRR.
Abstract:
We report a novel resistor-grid-network-based space cloth for application in single- and multi-layer radar absorbers. The space cloth is analyzed and relations are derived for the sheet resistance in terms of the resistors in the grid network. Design curves are drawn using MATLAB, and the space cloth is analyzed in a Salisbury screen for the S, C and X bands using HFSS™ software. Next, prediction and simulation results are reported for a three-layer Jaumann absorber using a square-grid resistor network, with a radar cross section reduction (RCSR) of -15 dB over the C, X and Ku bands. The simulation results are encouraging and have led to the fabrication of a prototype broadband radar absorber; experimental work is in progress.
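For context, a minimal sketch of the classical Salisbury-screen reflection calculation (a resistive sheet over a quarter-wave air spacer and a metal backing); it does not reproduce the paper's grid-network-to-sheet-resistance relations or the HFSS models, and the spacer thickness and sheet resistance are illustrative:

```python
# Sketch: reflection from a Salisbury screen modelled as a resistive sheet in
# parallel with a short-circuited (metal-backed) air-spacer transmission line.

import numpy as np

ETA0 = 376.73    # free-space wave impedance (ohm)
C0 = 3e8         # speed of light (m/s)

def reflection_db(freq_hz, rs_ohm_sq=376.73, spacer_m=7.5e-3):
    """Return the reflection coefficient magnitude in dB."""
    beta = 2 * np.pi * freq_hz / C0
    z_short = 1j * ETA0 * np.tan(beta * spacer_m)    # shorted air spacer
    z_in = 1.0 / (1.0 / rs_ohm_sq + 1.0 / z_short)   # sheet in parallel
    gamma = (z_in - ETA0) / (z_in + ETA0)
    return 20 * np.log10(np.abs(gamma))

freqs = np.linspace(2e9, 12e9, 11)    # coarse S- through X-band sweep
print(reflection_db(freqs))           # deepest null near 10 GHz for 7.5 mm spacer
```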
Abstract:
It is well accepted that technology plays a critical role in socio-technical transitions and sustainable development pathways. A society's amenability to the intervening (sustainable) technology is fundamental to permitting these transitions. The current age is at a juncture wherein technological advancements and capacities provide the common individual with affordable and unlimited choice. Technological advancement and complexity can either remain simple and unseen to the user or may daunt the user into keeping away, in which case the intended pathways remain unexploited. This paper explores the reasons behind the rejection of technology and proposes a solution model to address these factors in accommodating socio-technical transitions. The paper begins by structuring the societal levels at which technological rejection occurs and proceeds to discuss technology rejection at the individual user (niche) level. The factors influencing decisions regarding technology rejection are identified and discussed with particular relevance to the progressive world (Asia).
Abstract:
Empirical research available on technology transfer initiatives is either North American or European. Literature over the last two decades shows various research objectives, such as identifying the variables to be measured and the statistical methods to be used in studying university-based technology transfer initiatives. AUTM survey data from 1996 to 2008 provide insightful patterns about North American technology transfer initiatives; we use these data in this paper. The paper has three sections: a comparison of North American universities with (n=1129) and without medical schools (n=786), an analysis of the top 75th percentile of these samples, and a DEA analysis of these samples, using 20 variables. Researchers have attempted to classify university-based technology transfer variables into multiple stages, namely disclosures, patents and license agreements. Using the same approach, with minor variations, three stages are defined in this paper: the first stage takes R&D expenditure as the input and invention disclosures as the output; the second stage takes invention disclosures as the input and patents issued as the output; the third stage takes patents issued as the input and technology transfers as the outcome.
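A minimal sketch of the kind of DEA calculation mentioned above (an input-oriented CCR model in multiplier form, solved as a linear program); the data are made-up placeholders rather than AUTM figures, and only a single input/output pair is shown:

```python
# Sketch: input-oriented CCR DEA efficiency for each decision-making unit (DMU),
# e.g. R&D expenditure as the input and invention disclosures as the output.

import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Efficiency of DMU o given inputs X (n x m) and outputs Y (n x s)."""
    n, m = X.shape
    s = Y.shape[1]
    # Variables: output weights u (length s) followed by input weights v (length m).
    c = np.concatenate([-Y[o], np.zeros(m)])           # maximize u . y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]   # v . x_o = 1
    A_ub = np.hstack([Y, -X])                          # u.y_j - v.x_j <= 0 for all j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=[(0, None)] * (s + m))
    return -res.fun

X = np.array([[100.], [250.], [400.]])   # hypothetical research expenditure (input)
Y = np.array([[12.], [20.], [45.]])      # hypothetical invention disclosures (output)
print([round(ccr_efficiency(X, Y, o), 3) for o in range(3)])
```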
Abstract:
Wavelength-division multiplexing (WDM) technology, by which multiple optical channels can be transmitted simultaneously at different wavelengths through a single optical fiber, is a useful means of exploiting the low-loss characteristics of optical fibers over a wide wavelength region. Present-day multifunction radars with multiple transmit-receive modules require various kinds of signal distribution for real-time operation. If this signal distribution can be achieved through optical networks using WDM methods, the result is a distribution scheme with lower hardware complexity and a reduction in the weight of the antenna arrays. In addition, being an optical network, it is free from electromagnetic interference, which is a crucial requirement in an array environment. This paper discusses the analysis performed on various WDM components of an optical distribution network for radar applications. The analysis is performed by considering the feasible constant-gain regions of an erbium-doped fiber amplifier (EDFA) in the MATLAB environment, and will help the user select suitable components for WDM-based optical distribution networks.
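A minimal sketch, using a crude first-order gain-saturation model, of how a near-constant-gain input-power region of an EDFA might be located before choosing distribution components; the paper's MATLAB amplifier model is not reproduced here, and the small-signal gain and saturation power are hypothetical:

```python
# Sketch: locate the input-power range over which EDFA gain stays within 1 dB
# of its small-signal value, using G = G0 / (1 + P_in / P_sat) as a toy model.

import numpy as np

def edfa_gain_db(p_in_mw, g0_db=30.0, p_sat_mw=10.0):
    """First-order saturated gain, returned in dB."""
    g0 = 10 ** (g0_db / 10)
    return 10 * np.log10(g0 / (1 + p_in_mw / p_sat_mw))

p_in = np.logspace(-3, 1, 50)            # 1 uW to 10 mW input power
gain = edfa_gain_db(p_in)
flat = p_in[gain > gain.max() - 1.0]     # inputs within 1 dB of the peak gain
print(f"~constant-gain region: up to {flat.max():.3f} mW input")
```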
Abstract:
Floods are among the most detrimental hydro-meteorological threats to mankind, which compels the development of efficient flood assessment models. In this paper, we propose remote-sensing-based flood assessment using Synthetic Aperture Radar (SAR) imagery because of its imperviousness to unfavourable weather conditions. SAR images, however, suffer from speckle noise, so the images are processed in two stages: speckle removal filtering and image segmentation for flood mapping. The speckle noise is reduced with the help of Lee, Frost and Gamma MAP filters, and a performance comparison of these speckle removal filters is presented. From the results obtained, we deduce that the Gamma MAP filter is the most reliable. The selected Gamma MAP filtered image is segmented using the Gray Level Co-occurrence Matrix (GLCM) and Mean Shift Segmentation (MSS). GLCM is a texture analysis method that separates the image pixels into water and non-water groups based on their spectral features, whereas MSS is a gradient-ascent method in which segmentation is carried out using both spectral and spatial information. As a test case, the Kosi river flood is considered in our study. The segmentation results of both methods are comprehensively analysed, and we conclude that MSS is the more effective for flood mapping.
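A minimal sketch of the Lee filter named above, using the standard local-statistics formulation for multiplicative speckle; the window size and number of looks are illustrative, and the Frost and Gamma MAP filters used in the comparison are not reproduced:

```python
# Sketch: Lee speckle filter, pixel = local_mean + W * (pixel - local_mean),
# where W is derived from the local and noise coefficients of variation.

import numpy as np
from scipy.ndimage import uniform_filter

def lee_filter(img, window=7, looks=1):
    img = img.astype(float)
    mean = uniform_filter(img, window)
    mean_sq = uniform_filter(img ** 2, window)
    var = np.maximum(mean_sq - mean ** 2, 0.0)
    cu2 = 1.0 / looks                                  # (noise coeff. of variation)^2
    ci2 = var / np.maximum(mean ** 2, 1e-12)           # (local coeff. of variation)^2
    w = np.clip(1.0 - cu2 / np.maximum(ci2, 1e-12), 0.0, 1.0)
    return mean + w * (img - mean)

# Usage on a toy speckled scene ("water" vs "land"):
rng = np.random.default_rng(0)
clean = np.ones((64, 64)); clean[:, 32:] = 4.0
speckled = clean * rng.gamma(1.0, 1.0, clean.shape)    # 1-look multiplicative speckle
print(lee_filter(speckled).std(), speckled.std())       # filtered image is smoother
```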
Abstract:
Reflectivity data from the radars onboard CloudSat and TRMM are compared using coincident overpasses. Contoured frequency by altitude diagrams (CFADs) are constructed for two cases: (a) including only collocated vertical profiles that are most likely to be raining, and (b) including all collocated profiles along with cloudy pixels falling within about 50 km of the centre point of coincidence. Our analysis shows that in both cases CloudSat underestimates the radar reflectivity by about 10 dBZ compared to the TRMM radar below 15 km altitude. The difference is well outside the ~2 dBZ uncertainty of each radar. Further, CloudSat reflectivity shows a decreasing trend while that of the TRMM radar shows an increasing trend below 4 km height. The W-band radar that CloudSat flies suffers strong attenuation in precipitating clouds, and its reflectivity rarely exceeds 20 dBZ even though its technical specification indicates an upper measurement limit of 40 dBZ. The TRMM radar, on the other hand, cannot measure values below 17 dBZ. In fact, combining data from these two radars appears to give a better overall spatial structure of convective clouds.
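A minimal sketch of how a CFAD can be built, as a 2-D histogram of reflectivity versus altitude normalised at each altitude level; the synthetic profiles below are placeholders for the collocated CloudSat/TRMM reflectivity profiles used in the study:

```python
# Sketch: contoured frequency by altitude diagram (CFAD) from reflectivity profiles.

import numpy as np

def cfad(reflectivity_dbz, altitudes_km, dbz_bins, alt_bins):
    """Return the frequency (%) of dBZ values in each (altitude, dBZ) bin."""
    h, _, _ = np.histogram2d(altitudes_km.ravel(), reflectivity_dbz.ravel(),
                             bins=[alt_bins, dbz_bins])
    row_sums = h.sum(axis=1, keepdims=True)
    return 100.0 * h / np.maximum(row_sums, 1)     # normalise per altitude level

rng = np.random.default_rng(1)
alts = np.tile(np.arange(0.5, 15.5, 0.5), (200, 1))   # 200 toy profiles, altitude in km
dbz = rng.normal(10 - alts, 5)                         # toy reflectivity field (dBZ)
print(cfad(dbz, alts, np.arange(-20, 41, 2), np.arange(0, 16, 1)).shape)
```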
Abstract:
Identification and mapping of crevasses in glaciated regions is important for safe movement. However, the remote and rugged glacial terrain in the Himalaya poses great challenges for field data collection. In the present study, crevasse signatures were collected from the Siachen and Samudra Tapu glaciers in the Indian Himalaya using ground-penetrating radar (GPR). The surveys were conducted using 250 MHz antennas in ground mode and 350 MHz antennas in airborne mode. The identified signatures of open and hidden crevasses in the GPR profiles collected in ground mode were validated by ground truthing. The crevasse zones and buried boulder areas in a glacier were identified using a combination of airborne GPR profiles and SAR data, and validated against high-resolution optical satellite imagery (Cartosat-1) and a Survey of India map sheet. Using the multi-sensor data, a crevasse map for the Samudra Tapu glacier was prepared. The present methodology can also be used for mapping crevasse zones in other Himalayan glaciers.
Abstract:
A technology transfer office (TTO) based in a university system has multiple goals. Whilst commercialization is a critical goal, the maintenance and cleaning of the TTO's database also needs detailed attention. Literature in the area is scarce, and only some researchers make reference to TTO data cleaning. The challenge of data cleaning was encountered during an attempt to understand the commercial strategy of a university TTO in Bangalore. This paper describes a case study of data cleaning at an Indian university-based TTO, in which 382 patent records were analyzed. The case study first describes the background of the university system; second, the method used to clean the data and the experiences encountered are highlighted. The insights drawn indicate that patent data cleaning in a TTO is a specialized area which needs attention: overlooking this activity can have legal implications and may result in an inability to commercialize the patent. Two levels of patent data cleaning are discussed in this case study, and best practices of data cleaning in academic TTOs are presented.
Abstract:
Understanding technology evolution through periodic landscaping is an important stage of strategic planning in R&D management. In fields like healthcare, where the initial R&D investment is huge and good medical products serve patients better, these activities become crucial. Approximately five percent of the world's population has hearing disabilities, yet current hearing aid products meet less than ten percent of global needs. Patent data and classifications on cochlear implants from 1977-2010 show the landscape and evolution of this area. We attempt to highlight the emergence and disappearance of patent classes over time, showing variations in cochlear implant technologies. A network analysis technique is used to explore and capture technology evolution in patent classes, showing what emerged or disappeared over time, and dominant classes are identified. The sporadic influence of university research on cochlear implants is also discussed.