Abstract:
We report the large-scale deposition of tapered zinc oxide (ZnO) nanorods on a Si(100) substrate using a newly designed metal-organic complex of zinc (Zn) as the precursor and microwave-irradiation-assisted chemical synthesis as the process. The coatings are uniform, and high-density ZnO nanorods (~1.5 μm in length) grow over the entire area (625 mm²) of the substrate within 1-5 min of microwave irradiation. ZnO coatings obtained by solution-phase deposition yield strong UV emission. Variation of the molecular structure/molecular weight of the precursors and surfactants influences the crystallinity, morphology, and optical properties of the ZnO coatings. The precursors, in combination with the surfactant and the solvent, can be widely used to obtain the desired coating on any substrate. The growth mechanism and a schematic of the growth process of ZnO coatings on Si(100) are discussed. © 2013 Elsevier B.V. All rights reserved.
Abstract:
This study presents the synthesis, characterization, and kinetics of the steam reforming of methane and water gas shift (WGS) reactions over highly active and coke-resistant Zr0.93Ru0.05O2−δ. The catalyst showed high activity at low temperatures for both reactions. For the WGS reaction, 99% conversion of CO with 100% H2 selectivity was observed below 290 °C. Detailed kinetic studies, including the influence of gas-phase product species and the effects of temperature and catalyst loading on the reaction rates, were carried out. For the reforming reaction, the rate is first order in CH4 concentration and independent of the CO and H2O concentrations, indicating that the adsorptive dissociation of CH4 is the rate-determining step. The catalyst also showed excellent coke resistance, even at a stoichiometric steam-to-carbon ratio. The lack of CO methanation activity is an important finding of the present study and is attributed to the ionic nature of the Ru species. An associative mechanism involving surface formate as an intermediate was used to correlate the experimental data. Copyright © 2013, Hydrogen Energy Publications, LLC. Published by Elsevier Ltd. All rights reserved.
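The stated reaction orders amount to a simple power-law rate expression; the sketch below writes it with generic symbols and an assumed Arrhenius temperature dependence (illustrative only, not the paper's full formate-mediated kinetic model):

```latex
r_{\mathrm{SRM}} \;=\; k\,[\mathrm{CH_4}]^{1}\,[\mathrm{H_2O}]^{0}\,[\mathrm{CO}]^{0}
                 \;=\; A\,e^{-E_a/RT}\,[\mathrm{CH_4}]
```

A first-order dependence on CH4 alone is what one expects when the adsorptive dissociation of CH4 is rate determining, since the subsequent surface steps are then kinetically invisible.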
Abstract:
We present new data on the strength of the oceanic lithosphere along the Ninetyeast Ridge (NER) from two independent methods: spectral analysis (Bouguer coherence) using the fan wavelet transform technique, and spatial analysis (flexure inversion) with the convolution method. The two methods provide effective elastic thickness (T_e) patterns that broadly complement each other and correlate well with known surface structures and regional-scale features. Furthermore, our study presents a new high-resolution database of the Moho configuration, which obeys flexural isostasy and exhibits regional correlations with the T_e variations. A continuous ridge structure with a much lower T_e than that of normal oceanic lithosphere provides strong support for the hotspot theory. The derived T_e values vary over the northern (higher T_e, ~10-20 km), central (anomalously low T_e, ~0-5 km), and southern (low T_e, ~5 km) segments of the NER. The lack of correlation of T_e with the progressive aging of the lithosphere implies differences in the thermo-mechanical setting of the crust and underlying mantle in different parts of the NER, again indicating diversity in their evolution. The anomalously low T_e and deeper Moho (~22 km) estimates of the central NER (between 0.5°N and 17°S) are attributed to the interaction of a hotspot with the Wharton spreading ridge, which caused significant thermal rejuvenation and hence weakening of the lithosphere. The higher mechanical strength values in the northern NER (north of 0.5°N) may support the idea of off-ridge emplacement and a relatively large plate motion at the time of volcanism. The low T_e and deeper Moho (~22 km) estimates in the southern part (south of 17°S) suggest that the lithosphere was weak, and therefore younger, at the time of volcanism, supporting the idea that the southern NER was emplaced on the edge of the Indian plate. © 2013 Elsevier B.V. All rights reserved.
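For reference, both the Bouguer coherence and flexure inversion approaches rest on the standard thin elastic plate approximation, in which T_e enters only through the flexural rigidity D (a generic textbook formulation, not the specific parameterization used in this study):

```latex
D\,\nabla^{4} w \;+\; (\rho_m - \rho_{\mathrm{infill}})\,g\,w \;=\; q(x,y),
\qquad
D \;=\; \frac{E\,T_e^{3}}{12\,(1-\nu^{2})}
```

Here w is the plate deflection (Moho relief), q the applied load, E Young's modulus and ν Poisson's ratio; because D scales with the cube of T_e, the contrast between the ~0-5 km and ~10-20 km segments corresponds to orders-of-magnitude differences in flexural rigidity.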
Abstract:
The first regional synthesis of long-term (extending back ~25 years at some stations) primary data (from direct measurements) on aerosol optical depth from ARFINET (the network of aerosol observatories established over the Indian subcontinent under the Aerosol Radiative Forcing over India (ARFI) project of the Indian Space Research Organization) has revealed a statistically significant increasing trend with significant seasonal variability. Comparing the current values of the turbidity coefficient with those reported ~50 years ago reveals the phenomenal nature of the increase in aerosol loading. Seasonally, the rate of increase is consistently high during the dry months (December to March) over the entire region, whereas the trends are rather inconsistent and weak during the premonsoon (April to May) and summer monsoon (June to September) periods. The trends in the spectral variation of aerosol optical depth (AOD) reveal the significance of anthropogenic activities in the increasing trend in AOD. Examining these alongside climate variables such as seasonal and regional rainfall shows that the dry season exhibits a decreasing trend in the total number of rainy days over the Indian region. The insignificant trend in AOD observed over the Indo-Gangetic Plain, a regional hot spot of aerosols, during the premonsoon and summer monsoon seasons is mainly attributed to the competing effects of dust transport and wet removal of aerosols by the monsoon rain. Contributions of different aerosol chemical species to the total AOD, simulated using the Goddard Chemistry Aerosol Radiation and Transport model over the ARFINET stations, showed an increasing trend for all the anthropogenic components and a decreasing trend for dust, consistent with the inference deduced from the trend in the Ångström exponent.
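The link drawn between the spectral AOD trends and the Ångström exponent follows the standard Ångström power law, shown here in its usual two-wavelength form (a reference relation, not a result of the study):

```latex
\tau(\lambda) \;=\; \beta\,\lambda^{-\alpha},
\qquad
\alpha \;=\; -\,\frac{\ln\!\left(\tau_{\lambda_1}/\tau_{\lambda_2}\right)}{\ln\!\left(\lambda_1/\lambda_2\right)}
```

Here β is the turbidity coefficient (the AOD at λ = 1 μm) and α increases as fine-mode, largely anthropogenic, particles dominate over coarse dust, which is the sense in which a rising Ångström exponent corroborates the simulated shift in species contributions.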
Abstract:
Evaporative fraction (EF) is a measure of the fraction of the available energy at the earth's surface that is partitioned into latent heat flux. Currently operational thermal sensors on satellite platforms, such as the Moderate Resolution Imaging Spectroradiometer (MODIS), provide data only at 1000 m resolution, which constrains the spatial resolution of EF estimates. A simple model, disaggregation of evaporative fraction (DEFrac), based on the observed relationship between EF and the normalized difference vegetation index (NDVI), is proposed to spatially disaggregate EF. The DEFrac model was tested with EF estimated from the triangle method using 113 clear-sky data sets from the MODIS sensors aboard the Terra and Aqua satellites. Validation was done against data from four micrometeorological tower sites spanning varied agro-climatic zones and land cover conditions in India, using the Bowen ratio energy balance method. The root-mean-square error (RMSE) of EF estimated at 1000 m resolution using the triangle method was 0.09 for all four sites combined, and the RMSE of DEFrac-disaggregated EF was 0.09 at 250 m resolution. Two input-disaggregation models were also tried, with thermal data sharpened using the thermal sharpening models DisTrad and TsHARP; the RMSE of the disaggregated EF was 0.14 for both at 250 m resolution. Moreover, a spatial analysis of the disaggregation was performed using Landsat-7 Enhanced Thematic Mapper (ETM+) data over four grids in India for contrasting seasons. The DEFrac model performed better than the input-disaggregation models under cropped conditions, while the two approaches were marginally similar under non-cropped conditions.
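For context, the evaporative fraction and its relation to the Bowen ratio β used in the tower validation take the standard forms below; the specific EF-NDVI relationship fitted by DEFrac is not given in the abstract and is not reproduced here:

```latex
\mathrm{EF} \;=\; \frac{LE}{LE + H} \;=\; \frac{LE}{R_n - G} \;=\; \frac{1}{1 + \beta},
\qquad
\beta \;=\; \frac{H}{LE}
```

where LE is the latent heat flux, H the sensible heat flux, R_n the net radiation and G the ground heat flux; the second equality assumes surface energy balance closure (R_n − G = LE + H).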
Abstract:
We model the communication of bursty sources 1) over multiaccess channels, with either independent decoding or joint decoding, and 2) over degraded broadcast channels, by a discrete-time multiclass processor sharing queue. We use error exponents to characterize the processor sharing queue. We analyze the processor sharing queue model for the stable region of message arrival rates and show the existence of scheduling policies for which the stability region converges to the information-theoretic capacity region in an appropriate limiting sense.
Abstract:
The objective of this work is to develop downscaling methodologies to obtain a long time record of inundation extent at high spatial resolution based on the existing low-spatial-resolution results of the Global Inundation Extent from Multi-Satellites (GIEMS) dataset. In semiarid regions, high-spatial-resolution a priori information can be provided by visible and infrared observations from the Moderate Resolution Imaging Spectroradiometer (MODIS). The study concentrates on the Inner Niger Delta, where MODIS-derived inundation extent has been estimated at 500-m resolution. The space-time variability is first analyzed using a principal component analysis (PCA). This is particularly effective for understanding the inundation variability, interpolating in time, and filling in missing values. Two innovative methods are developed (linear regression and matrix inversion), both based on the PCA representation. These GIEMS downscaling techniques have been calibrated using the 500-m MODIS data, and the downscaled fields show the expected space-time behavior from MODIS. A 20-yr dataset of inundation extent at 500 m is derived from this analysis for the Inner Niger Delta. The methods are very general and may be applied to many basins and to variables other than inundation, provided enough a priori high-spatial-resolution information is available. The derived high-spatial-resolution dataset will be used in the framework of the Surface Water and Ocean Topography (SWOT) mission to develop and test the instrument simulator as well as to select the calibration/validation sites (with high space-time inundation variability). In addition, once SWOT observations are available, the downscaling methodology will be calibrated on them in order to downscale the GIEMS dataset and extend the SWOT benefits back in time to 1993.
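As an illustration of how a PCA-plus-linear-regression downscaling of this kind can be organized, the sketch below projects a coarse-resolution record onto the leading temporal modes of a high-resolution training set (the array names, the choice of three modes and the least-squares step are hypothetical; this is not the authors' GIEMS/MODIS processing chain):

```python
import numpy as np

# Hypothetical inputs (not the actual GIEMS/MODIS products):
#   coarse : (n_times, n_coarse_cells) low-resolution inundated fractions
#   fine   : (n_times_train, n_fine_cells) high-resolution fractions,
#            available only for a calibration period.

def pca_downscale(coarse, fine, n_modes=3):
    """Regress the leading modes of the fine-scale training data on the
    coarse fields, then rebuild high-resolution maps for the full record."""
    fine_mean = fine.mean(axis=0)
    anomalies = fine - fine_mean

    # Temporal principal components (SVD) of the high-resolution training set
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    pcs = u[:, :n_modes] * s[:n_modes]      # mode amplitudes, one row per training date
    eofs = vt[:n_modes]                     # spatial patterns, one row per mode

    # Least-squares regression of mode amplitudes on the coarse-scale fields,
    # assuming the training dates are the first dates of the coarse record
    coarse_train = coarse[: fine.shape[0]]
    coeffs, *_ = np.linalg.lstsq(coarse_train, pcs, rcond=None)

    # Predict amplitudes for every date in the coarse record and reconstruct
    pcs_full = coarse @ coeffs
    return np.clip(fine_mean + pcs_full @ eofs, 0.0, 1.0)
```

In practice the reconstructed fractions would still have to be checked against withheld MODIS dates, which is essentially the calibration/validation role the 500-m data play in the study.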
Abstract:
The Large Hadron Collider (LHC) has completed its run at 8 TeV, with the ATLAS and CMS experiments having each collected about 25 fb⁻¹ of data. The discovery of a light Higgs boson, coupled with the lack of evidence for supersymmetry at the LHC so far, has motivated studies of supersymmetry in the context of naturalness, with the principal focus being the third-generation squarks. In this work, we analyze the prospects of the flavor-violating decay mode t̃₁ → c χ₁⁰ at 8 and 13 TeV center-of-mass energy at the LHC. This channel is also relevant in the dark matter context for the stop-coannihilation scenario, where the relic density depends on the mass difference between the lighter stop squark (t̃₁) and the lightest neutralino (χ₁⁰). This channel is extremely challenging to probe, especially when the mass difference between the lighter stop and the lightest neutralino is small. Using certain kinematical properties of the signal events, we find that the level of backgrounds can be reduced substantially. We find that the prospects for this channel are limited by the low production cross section for top squarks and the limited luminosity at 8 TeV, but at the 13 TeV LHC with 100 fb⁻¹ of luminosity it is possible to probe top squarks with masses up to ~450 GeV. We also discuss how the sensitivity could be significantly improved by tagging charm jets.
Abstract:
Elasticity in cloud systems provides the flexibility to acquire and relinquish computing resources on demand. However, in current virtualized systems resource allocation is mostly static: resources are allocated during VM instantiation, and any change in workload leading to a significant increase or decrease in resource needs is handled by VM migration. Hence, cloud users tend to characterize their workloads at a coarse-grained level, which potentially leads to under-utilized VM resources or an under-performing application. A more flexible and adaptive resource allocation mechanism would benefit variable workloads, such as those characteristic of web servers. In this paper, we present an elastic resource framework for the IaaS cloud layer that addresses this need. The framework provides an application workload forecasting engine that predicts the expected demand at run time; this prediction is input to the resource manager, which modulates resource allocation based on the predicted demand. Depending on the prediction errors, resources can be over-allocated or under-allocated compared to the actual demand made by the application. Over-allocation leads to unused resources, and under-allocation can cause under-performance. To strike a good trade-off between over-allocation and under-performance, we derive an excess cost model in which excess resources allocated are captured as an over-allocation cost and under-allocation is captured as a penalty cost for violating the application service level agreement (SLA). The confidence interval of the predicted workload is used to minimize this excess cost with minimal effect on SLA violations. An example case study for an academic institute's web server workload is presented. Using the confidence interval to minimize the excess cost, we achieve a significant reduction in the resource allocation requirement while restricting application SLA violations to below 2-3%.
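A minimal sketch of the trade-off such an excess cost model captures, assuming a Gaussian forecast error and hypothetical unit costs (the function name, constants and allocation rule are illustrative, not the paper's calibrated model):

```python
from statistics import NormalDist

def excess_cost(predicted, actual, sigma, confidence=0.95,
                c_over=1.0, c_sla=10.0):
    """Cost of allocating at the upper confidence bound of a demand forecast.

    predicted, actual : forecast and realised resource demand (same units)
    sigma             : standard deviation of the forecast error (Gaussian assumed)
    c_over            : cost per unit of idle, over-allocated resource
    c_sla             : penalty per unit of unmet demand (SLA violation)
    """
    # Allocate at the one-sided upper confidence bound of the prediction
    allocation = predicted + NormalDist().inv_cdf(confidence) * sigma

    over_alloc = max(0.0, allocation - actual)   # paid-for but unused capacity
    under_alloc = max(0.0, actual - allocation)  # shortfall -> SLA penalty
    return c_over * over_alloc + c_sla * under_alloc

# Example: forecast 80 units, actual demand 95, forecast standard deviation 10
print(excess_cost(predicted=80, actual=95, sigma=10, confidence=0.95))
```

Raising the confidence level pushes the allocation toward the upper tail of the forecast distribution, trading more over-allocation cost for fewer SLA penalties; choosing a level that minimizes the combined excess cost while keeping violations to a few percent is exactly the role the confidence interval plays in the framework.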
Abstract:
In this paper, we extend the characterization of when Z[x]/(f), where f ∈ Z[x], is a free Z-module to multivariate polynomial rings over any commutative Noetherian ring A. The characterization allows us to extend the Gröbner basis method for computing a k-vector space basis of residue class polynomial rings over a field k (the Macaulay-Buchberger basis theorem) to rings, i.e. A[x_1, ..., x_n]/a, where a ⊆ A[x_1, ..., x_n] is an ideal. We give some insights into the characterization for two special cases, A = Z and A = k[θ_1, ..., θ_m]. As an application of this characterization, we show that the concept of border bases can be extended to rings when the corresponding residue class ring is a finitely generated, free A-module. © 2014 Elsevier B.V. All rights reserved.
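A small illustrative contrast (not taken from the paper) shows the kind of freeness the characterization concerns in the simplest case A = Z:

```latex
\mathbb{Z}[x]/(x^{2}-2) \;=\; \mathbb{Z}\,\bar{1} \,\oplus\, \mathbb{Z}\,\bar{x}
\quad\text{is a free } \mathbb{Z}\text{-module of rank } 2,
\qquad
\mathbb{Z}[x]/(2x) \text{ is not free, since } 2\bar{x}=0 \text{ but } \bar{x}\neq 0.
```

In the first quotient the monic generator lets every residue class be written uniquely as a + b·x̄ with a, b ∈ Z; in the second, the non-unit leading coefficient produces torsion, a kind of obstruction that any Macaulay-Buchberger-style basis theorem over a general ring A has to rule out.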