937 results for methodologies


Relevance: 10.00%

Abstract:

ZnO is a unique material with numerous applications. There has been great interest in synthesizing long nanowires to explore new length-dependent technologies, but the quest to increase length is limited by experimental shortcomings such as catalyst poisoning, degradation of the precursors, and growth in all possible directions. This review article focuses on the growth of ultralong ZnO nanowires, along with possible methodologies to overcome these limitations.

Relevance: 10.00%

Abstract:

We address a physics-based solution of the Joule-heating problem in a single-layer graphene (SLG) sheet in the presence of the Thomson effect. We demonstrate that the temperature of an isotopically pure (C-12 only) SLG sheet reaches its saturation level faster than that of a sheet doped with the C-13 isotope. From the solution of the Joule-heating equation, we find that the thermal time constant of the SLG sheet is on the order of tenths of a nanosecond for sheet dimensions of a few micrometers. These results are formulated using electron interactions with the in-plane and flexural phonons to obtain a field-dependent Landauer transmission coefficient. We further develop an analytical model of the SLG specific heat using the quadratic (out-of-plane) phonon branch above room temperature. Additionally, we show that the cooling of the SLG sheet can be substantially enhanced by the addition of C-13. The methodologies discussed in this paper can be extended to the analysis of graphene heat spreaders.
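
As a rough illustration of the thermal time constant mentioned above, the sketch below models the temperature rise of a heated sheet as a first-order (lumped) exponential approach to saturation. The time constant and temperature rise are assumed placeholder values; the paper's actual results come from a phonon-limited transport model, not this simplification.

```python
# Illustrative lumped-parameter sketch of a Joule-heating transient in an SLG sheet.
# The time constant and temperature rise are assumed placeholder values, not results
# from the paper's phonon-transport model.
import numpy as np

tau = 0.1e-9          # assumed thermal time constant, ~tenths of a nanosecond
T_ambient = 300.0     # K, room temperature
dT_saturation = 50.0  # K, assumed steady-state temperature rise from Joule heating

def sheet_temperature(t):
    """First-order temperature response T(t) = T_amb + dT*(1 - exp(-t/tau))."""
    return T_ambient + dT_saturation * (1.0 - np.exp(-t / tau))

# Time for the sheet to reach 95% of its saturation temperature rise (~3*tau).
t_95 = -tau * np.log(0.05)
print(f"95% of saturation reached after {t_95 * 1e9:.2f} ns")
print(f"T at t = 0.5 ns: {sheet_temperature(0.5e-9):.1f} K")
```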

Relevance: 10.00%

Abstract:

Research has been undertaken to ascertain the predictability of non-stationary time series using wavelet- and Empirical Mode Decomposition (EMD)-based time series models. Methods have been developed in the past to decompose a time series into components, and forecasting these components together with a random component can yield predictions. Following this approach, wavelet and EMD analyses are incorporated separately; each decomposes a time series into independent, orthogonal components with both time and frequency localization. The component series are fitted with specific autoregressive models to obtain forecasts, which are then combined to obtain the final predictions. Four non-stationary streamflow sites (USGS data resources) with monthly total volumes and two non-stationary gridded rainfall sites (IMD) with monthly total rainfall are considered for the study. Predictability is checked for six- and twelve-month-ahead forecasts for both methodologies. Based on performance measures, the wavelet-based method is observed to have better prediction capability than the EMD-based method, despite some of the limitations of time series methods and of the manner in which the decomposition takes place. The study concludes that the wavelet-based time series algorithm can be used to model events such as droughts with reasonable accuracy. Modifications that could extend the scope of applicability to other areas of hydrology are also discussed. (C) 2013 Elsevier B.V. All rights reserved.
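
A minimal sketch of the wavelet-based variant of this scheme is given below, assuming synthetic monthly data, a db4 wavelet, three decomposition levels, and AR(12) component models (all choices are illustrative, not the authors' calibrated settings): decompose the series into additive components, fit an autoregressive model per component, and sum the component forecasts.

```python
# A minimal sketch (not the authors' exact model) of wavelet-based forecasting:
# decompose a monthly series into additive components, fit an AR model to each,
# and sum the component forecasts. Series and model orders are assumed placeholders.
import numpy as np
import pywt
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
n = 240  # 20 years of synthetic monthly flows (placeholder data)
t = np.arange(n)
series = 50 + 20 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 5, n)

# Multiresolution decomposition: reconstruct each level's contribution separately,
# so the components sum back to the original series.
wavelet, level = "db4", 3
coeffs = pywt.wavedec(series, wavelet, level=level)
components = []
for i in range(len(coeffs)):
    kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
    components.append(pywt.waverec(kept, wavelet)[:n])

# Fit an AR model to each component and combine 12-month-ahead forecasts.
horizon = 12
forecast = np.zeros(horizon)
for comp in components:
    model = AutoReg(comp, lags=12).fit()
    forecast += model.predict(start=n, end=n + horizon - 1)

print("12-month-ahead forecast:", np.round(forecast, 1))
```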

Relevance: 10.00%

Abstract:

Dead time is provided between the gating signals of the top and bottom semiconductor switches in an inverter leg to prevent shorting of the DC bus. This dead time causes a significant, unwanted change in the output voltage of the inverter, and the effect differs between pulse width modulation (PWM) methodologies. The effect of dead time on the fundamental output voltage is studied theoretically as well as experimentally for bus-clamping PWM methodologies. Further, experimental observations on the effectiveness of dead-time compensation are presented.
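
As a back-of-the-envelope illustration of why dead time matters, the sketch below evaluates the commonly used first-order estimate of the average voltage error, dV ≈ t_dead · f_sw · V_dc, whose sign opposes the load current. The bus voltage, switching frequency, and dead time are assumed values, and this is not the paper's bus-clamping-specific analysis.

```python
# A first-order, textbook-style estimate (assumed values; not the paper's analysis)
# of the average per-phase voltage error introduced by dead time:
#   dV ≈ (t_dead * f_sw) * V_dc, opposing the direction of the load current.
V_dc = 600.0        # DC bus voltage in volts (assumed)
f_sw = 10e3         # switching frequency in Hz (assumed)
t_dead = 3e-6       # dead time in seconds (assumed)

def deadtime_voltage_error(load_current_sign: float) -> float:
    """Average output-voltage error; its sign opposes the load current direction."""
    return -load_current_sign * t_dead * f_sw * V_dc

print(f"Error for positive current: {deadtime_voltage_error(+1):.1f} V")
print(f"Error for negative current: {deadtime_voltage_error(-1):.1f} V")
```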

Relevance: 10.00%

Abstract:

In this paper we propose a framework for determining the optimum steering input of all-wheel-steer vehicles (AWSVs) on rough terrain. The framework computes the steering input that minimizes the tracking error for a given trajectory. Unlike previous methodologies for computing steering inputs of car-like vehicles, the proposed methodology depends explicitly on the vehicle dynamics and can be extended to vehicles with an arbitrary number of steering inputs. A fully generic framework is used to derive the vehicle dynamics, and a nonlinear-programming-based constrained optimization approach is used to compute the steering input, considering the instantaneous vehicle dynamics and the no-slip and contact constraints of the vehicle. All-wheel-steer vehicles have a special parallel-steering ability in which the instantaneous centre of rotation (ICR) is at infinity. The proposed framework automatically enables the vehicle to choose between parallel steering and normal operation depending on the error with respect to the desired trajectory. The efficacy of the proposed framework is demonstrated by extensive uneven-terrain simulations for trajectories with continuous or discontinuous velocity profiles.
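
The sketch below illustrates the constrained-optimization idea with a deliberately simplified kinematic error model in place of the paper's full AWSV dynamics; the geometry, bounds, and regularization weight are assumptions.

```python
# A minimal sketch of choosing per-wheel steering angles by constrained optimization.
# The simple bicycle-like error model below is a stand-in for the paper's full
# vehicle dynamics; all geometry and weights are assumed placeholders.
import numpy as np
from scipy.optimize import minimize

desired_heading_rate = 0.2                   # rad/s, from the reference trajectory (assumed)
speed = 2.0                                  # m/s (assumed)
axle_positions = np.array([1.0, -1.0])       # front/rear axle x-offsets in m (assumed)

def tracking_error(angles):
    """Squared error between achieved and desired yaw rate for a kinematic
    front/rear-steer bicycle model, plus a small effort penalty."""
    yaw_rate = speed * (np.tan(angles[0]) - np.tan(angles[1])) / (
        axle_positions[0] - axle_positions[1])
    return (yaw_rate - desired_heading_rate) ** 2 + 1e-3 * np.sum(angles ** 2)

bounds = [(-0.5, 0.5)] * 2                   # steering limits in rad (assumed)
result = minimize(tracking_error, x0=np.zeros(2), bounds=bounds, method="SLSQP")
print("Optimal front/rear steering angles (rad):", np.round(result.x, 3))
```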

Relevance: 10.00%

Abstract:

The objective of this work is to develop downscaling methodologies that provide a long record of inundation extent at high spatial resolution, based on the existing low-spatial-resolution results of the Global Inundation Extent from Multi-Satellites (GIEMS) dataset. In semiarid regions, high-spatial-resolution a priori information can be provided by visible and infrared observations from the Moderate Resolution Imaging Spectroradiometer (MODIS). The study concentrates on the Inner Niger Delta, where MODIS-derived inundation extent has been estimated at 500-m resolution. The space-time variability is first analyzed using a principal component analysis (PCA), which is particularly effective for understanding the inundation variability, interpolating in time, and filling in missing values. Two innovative methods are developed (linear regression and matrix inversion), both based on the PCA representation. These GIEMS downscaling techniques are calibrated using the 500-m MODIS data, and the downscaled fields show the space-time behavior expected from MODIS. A 20-yr dataset of inundation extent at 500 m is derived from this analysis for the Inner Niger Delta. The methods are very general and may be applied to many basins and to variables other than inundation, provided enough a priori high-spatial-resolution information is available. The derived high-spatial-resolution dataset will be used in the framework of the Surface Water and Ocean Topography (SWOT) mission to develop and test the instrument simulator as well as to select calibration/validation sites (with high space-time inundation variability). In addition, once SWOT observations are available, the downscaling methodology will be calibrated on them in order to downscale the GIEMS dataset and extend the SWOT benefits back in time to 1993.
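
A minimal sketch of the PCA-plus-regression flavor of such downscaling is given below, with synthetic stand-ins for the coarse GIEMS record and the MODIS-era high-resolution maps; the array sizes and the number of retained components are assumptions.

```python
# A minimal sketch (assumed shapes, synthetic data) of PCA-plus-regression downscaling:
# compress the coarse-resolution time series into a few principal components, then
# regress high-resolution pixels on those components over the overlap period.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n_months, n_coarse, n_fine = 240, 100, 2500    # placeholder grid sizes
coarse = rng.random((n_months, n_coarse))      # stand-in for GIEMS inundation fractions
fine_overlap = rng.random((120, n_fine))       # stand-in for MODIS-era 500-m maps

# Temporal principal components of the coarse record.
pca = PCA(n_components=5)
scores = pca.fit_transform(coarse)             # shape (n_months, 5)

# Learn fine-resolution pixels as linear combinations of the PC scores,
# using only the months where high-resolution observations exist.
reg = LinearRegression().fit(scores[:120], fine_overlap)

# Downscale the full record, including months with no high-resolution coverage.
fine_full = reg.predict(scores)                # shape (n_months, n_fine)
print("Downscaled field shape:", fine_full.shape)
```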

Relevance: 10.00%

Abstract:

With the development of deep sequencing methodologies, it has become important to construct site saturation mutant (SSM) libraries in which every nucleotide/codon in a gene is individually randomized. We describe methodologies for the rapid, efficient, and economical construction of such libraries using inverse polymerase chain reaction (PCR). We show that if the degenerate codon is in the middle of the mutagenic primer, there is an inherent PCR bias due to the thermodynamic mismatch penalty, which decreases the proportion of unique mutants. Introducing a nucleotide bias in the primer can alleviate the problem. Alternatively, if the degenerate codon is placed at the 5' end, there is no PCR bias, which results in a higher proportion of unique mutants. This also facilitates detection of deletion mutants resulting from errors during primer synthesis. This method can be used to rapidly generate SSM libraries for any gene or nucleotide sequence, which can subsequently be screened and analyzed by deep sequencing. (C) 2013 Elsevier Inc. All rights reserved.
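
As a small side illustration of the degenerate-codon idea underlying SSM libraries (the paper itself does not prescribe a specific codon), the sketch below enumerates the NNK codon set, a common choice that covers all 20 amino acids with 32 codons; it assumes Biopython is available for translation.

```python
# A small illustration (not the authors' protocol) of why NNK degenerate codons are
# popular for site-saturation mutagenesis: 32 codons cover all 20 amino acids plus a
# single stop codon. Requires Biopython for the codon translation.
from itertools import product
from Bio.Seq import Seq

N = "ACGT"
K = "GT"
nnk_codons = ["".join(c) for c in product(N, N, K)]
amino_acids = {str(Seq(codon).translate()) for codon in nnk_codons}

print(f"NNK codons: {len(nnk_codons)}")                              # 32
print(f"Distinct amino acids (incl. stop '*'): {len(amino_acids)}")  # 21
print(sorted(amino_acids))
```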

Relevance: 10.00%

Abstract:

A variety of methods are available to estimate future solar radiation (SR) scenarios at spatial scales appropriate for local climate change impact assessment. However, there are no clear guidelines in the literature on which methodologies are most suitable for different applications. Three methodologies to guide the estimation of SR are discussed in this study: Case 1, SR is measured; Case 2, SR is measured but sparse; and Case 3, SR is not measured. In Case 1, future SR scenarios are derived using several downscaling methodologies that transfer the simulated large-scale information of global climate models to the local (measurement) scale. In Case 2, SR is first estimated at the local scale over a longer period using the sparse measured records, and future scenarios are then derived using several downscaling methodologies. In Case 3, SR is first estimated at a regional scale over a longer period using complete or sparse measured records, from which SR at the local scale is estimated; future scenarios are then derived using several downscaling methodologies. The lack of observed SR data, especially in developing countries, has hindered various climate change impact studies, so the Case 3 methodology is further elaborated by applying it to the semi-arid Malaprabha reservoir catchment in southern India. A support vector machine is used in downscaling SR. Future monthly SR scenarios are estimated from simulations of the third-generation Canadian General Circulation Model (CGCM3) for various SRES emission scenarios (A1B, A2, B1, and COMMIT). Results indicate a projected decrease of 0.4 to 12.2 W m⁻² yr⁻¹ in SR during the period 2001-2100 across the four scenarios. SR was calculated using the modified Hargreaves method. The decreasing trends for the future are in agreement with the SR simulations obtained directly from the CGCM3 model for the four scenarios.
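
A minimal sketch of SVM-based downscaling is shown below with synthetic placeholder predictors and radiation values; the predictor count, kernel, and hyperparameters are assumptions rather than the study's calibrated configuration.

```python
# A minimal sketch (synthetic placeholder data) of statistical downscaling with a
# support vector machine: regress local monthly solar radiation on large-scale
# GCM predictor variables, as in the Case 3 workflow described above.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(42)
n_months, n_predictors = 360, 8                      # assumed sizes
X_gcm = rng.normal(size=(n_months, n_predictors))    # stand-in for GCM predictor fields
y_sr = 200 + 10 * X_gcm[:, 0] - 5 * X_gcm[:, 1] + rng.normal(0, 3, n_months)  # W/m^2

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X_gcm[:300], y_sr[:300])                   # calibrate on the observed period

# Apply to "future" predictor fields (here, held-out months) to obtain SR scenarios.
print("Projected SR (W/m^2):", np.round(model.predict(X_gcm[300:305]), 1))
```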

Relevance: 10.00%

Abstract:

A controlled laboratory experiment was carried out on forty Indian male college students to evaluate the effect of the indoor thermal environment on occupants' responses and thermal comfort. During the experiment, the indoor temperature varied from 21 °C to 33 °C, and variables such as relative humidity, airflow, air temperature, and radiant temperature were recorded along with the subjects' physiological parameters (skin temperature, T_sk, and oral temperature, T_c) and subjective thermal sensation responses. Body temperature (T_b) was evaluated from T_sk and T_c. The subjective Thermal Sensation Vote (TSV) was recorded using the ASHRAE 7-point scale. In the PMV model, Fanger's T_sk equation was used to accommodate the adaptive response. Step-wise regression analysis showed that T_b was a better predictor of TSV than T_sk or T_c. Regional skin-temperature responses, suppressed sweating without dripping, a lower sweating threshold temperature, and a higher cutaneous threshold for sweating were observed as thermal adaptive responses; these adaptive responses cannot be captured by the PMV model. To incorporate the subjective adaptive response, the mean skin temperature (T_sk) is used in the dry heat loss calculation. Along with this, the PMV model and two other methodologies are adopted to calculate PMV values, and the results are compared. However, recent literature on measured sweat rates in Indians is limited, and the assumption of a constant E_rsw in the PMV model needs to be corrected. Using measured T_sk in the PMV model (Method 1), the thermal comfort zone corresponding to -0.5 <= PMV <= 0.5 was evaluated as 22.46-25.41 °C with a neutral temperature of 23.91 °C; using the TSV response, a wider comfort zone of 23.25-26.32 °C was estimated with a neutral temperature of 24.83 °C, which increased further with the new TSV-PPD relation. It was observed that the PMV model overestimated the actual thermal response. Interestingly, these subjects were found to be less sensitive to heat but more sensitive to cold. A new TSV-PPD relation (PPD_new) was obtained from the population distribution of TSV responses, with an asymmetric distribution of hot-cold thermal sensation responses for Indians. The human thermal stress calculations based on the steady-state energy balance underlying the PMV model appear to be inadequate for evaluating the thermal sensation of Indians. Relevance to industry: The purpose of this paper is to estimate the thermal comfort zone and optimum temperature for Indians. It also highlights that the PMV model appears inadequate for evaluating subjective thermal perception in Indians. These results can be used in the feedback control of HVAC systems in residential and industrial buildings. (C) 2014 Elsevier B.V. All rights reserved.
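
For reference, the sketch below evaluates the standard ISO 7730/Fanger relation between PMV and PPD; the paper replaces this symmetric curve with a new asymmetric TSV-PPD relation for Indian subjects, which is not reproduced here.

```python
# The standard ISO 7730 / Fanger relation between PMV and PPD, shown for reference.
# The paper argues this symmetric curve overestimates discomfort for the Indian
# subjects and derives a new asymmetric TSV-PPD relation, not reproduced here.
import numpy as np

def ppd_from_pmv(pmv: float) -> float:
    """Predicted Percentage Dissatisfied as a function of the Predicted Mean Vote."""
    return 100.0 - 95.0 * np.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

for pmv in (-1.0, -0.5, 0.0, 0.5, 1.0):
    print(f"PMV = {pmv:+.1f}  ->  PPD = {ppd_from_pmv(pmv):.1f} %")
```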

Relevance: 10.00%

Abstract:

Non-invasive 3D imaging in materials and medical research involves methodologies such as X-ray imaging, MRI, fluorescence and optical coherence tomography, and NIR absorption imaging, which provide global morphological/density/absorption information about hidden components. However, molecular information on such buried materials has remained elusive. In this article we demonstrate the observation of molecular structural information from materials hidden or buried at depth using Raman scattering. Typically, Raman spectroscopic observations are made at fixed collection angles, such as 90°, 135°, and 180°, except in spatially offset Raman scattering (SORS), which collects only back-scattered photons, and transmission techniques. Such specific collection angles restrict the observation of Raman signals to those originating at or near the surface of the material. Universal Multiple Angle Raman Spectroscopy (UMARS), presented here, employs (a) the penetration of photons and their subsequent diffuse propagation through non-absorbing media by multiple scattering and (b) the detection of signals from all observable angles.

Relevance: 10.00%

Abstract:

The Computational Analysis of Novel Drug Opportunities (CANDO) platform (http://protinfo.org/cando) uses similarity of compound-proteome interaction signatures to infer homology of compound/drug behavior. We constructed interaction signatures for 3733 human-ingestible compounds covering 48,278 protein structures and mapping to 2030 indications, based on basic-science methodologies for predicting and analyzing protein structure, function, and interactions developed by us and others. Our signature comparison and ranking approach yielded benchmarking accuracies of 12-25% for 1439 indications with at least two approved compounds. We prospectively validated 49/82 "high value" predictions from nine studies covering seven indications, with comparable or better activity than existing drugs, and these serve as novel repurposed therapeutics. Our approach may be generalized to compounds beyond those approved by the FDA and can also consider mutations in protein structures to enable personalization. Our platform provides a holistic multiscale modeling framework for complex atomic, molecular, and physiological systems, with broader applications in medicine and engineering.
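
The core signature-comparison step can be pictured as ranking compounds by the similarity of their interaction-score vectors. The sketch below uses cosine similarity on random placeholder signatures; it is an illustration of the idea, not the CANDO pipeline or its scoring scheme.

```python
# A minimal sketch of the signature-comparison idea: represent each compound as a
# vector of predicted interaction scores against a protein library and rank other
# compounds by similarity to a query. Random placeholder data, not CANDO's pipeline.
import numpy as np

rng = np.random.default_rng(7)
n_compounds, n_proteins = 1000, 5000           # assumed library sizes
signatures = rng.random((n_compounds, n_proteins))

def rank_by_similarity(query_idx: int, top_k: int = 5) -> np.ndarray:
    """Return indices of the compounds whose signatures are most similar (cosine)."""
    q = signatures[query_idx]
    sims = signatures @ q / (np.linalg.norm(signatures, axis=1) * np.linalg.norm(q))
    order = np.argsort(-sims)
    return order[order != query_idx][:top_k]

print("Compounds behaving most like compound 0:", rank_by_similarity(0))
```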

Relevance: 10.00%

Abstract:

Chiral auxiliaries are used for the NMR spectroscopic study of enantiomers. Often the presence of impurities, severe overlap of peaks, excessive line broadening, and complex multiplicity patterns restrict chiral analysis using the 1D H-1 NMR spectrum. There are a few approaches to resolving the overlapped peaks. One approach is to use a suitable chiral auxiliary that induces a large chemical shift difference between the discriminated peaks (Δδ(R,S)) and minimizes the overlap. Another approach is to design appropriate NMR experiments to circumvent some of these problems, viz., enhancing spectral resolution, unravelling the superimposed spectra of enantiomers, and reducing spectral complexity. A large number of NMR techniques, such as two-dimensional selective F-1 decoupling, RES-TOCSY, multiple-quantum detection, frequency-selective homodecoupling, band-selective homodecoupling, and broadband homodecoupling, have been reported for this purpose. Many of these techniques have aided chiral analysis of molecules of diverse functionality in the presence of chiral auxiliaries. The present review summarizes the recently reported NMR experimental methodologies, with special emphasis on the work carried out in the authors' laboratory.

Relevance: 10.00%

Abstract:

A brief account of the basic principles and methodologies of the MRI technique, from its beginning, is outlined. The pulse sequences used for MRI are explained, including Fourier imaging (phase encoding), echo-planar imaging (EPI) for acquisition of a whole plane in a single excitation, and T1 and T2 contrast enhancement. Associated methods such as MR spectroscopy, flow measurement (MR angiography), lung imaging using hyperpolarized Xe-129 and He-3, and functional imaging (fMRI) are described.
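
A toy illustration of the Fourier-imaging principle is given below: the scanner fills k-space with Fourier samples of the object, and the image is recovered by an inverse 2D FFT. The square phantom is a synthetic placeholder.

```python
# A toy illustration of the Fourier-imaging principle described above: the scanner
# samples k-space (the 2D Fourier transform of the object), and the image is
# recovered by an inverse 2D FFT. The "phantom" is a synthetic placeholder object.
import numpy as np

# Simple square phantom as the object being imaged.
phantom = np.zeros((128, 128))
phantom[48:80, 40:88] = 1.0

# Forward model: frequency/phase encoding fills k-space with Fourier samples.
k_space = np.fft.fftshift(np.fft.fft2(phantom))

# Reconstruction: inverse 2D FFT of the acquired k-space data.
image = np.abs(np.fft.ifft2(np.fft.ifftshift(k_space)))

print("Max reconstruction error:", float(np.max(np.abs(image - phantom))))
```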

Relevance: 10.00%

Abstract:

Ice volume estimates are crucial for assessing the water reserves stored in glaciers. Owing to its large glacier coverage, such estimates are of particular interest for the Himalayan-Karakoram (HK) region. In this study, different existing methodologies are used to estimate the ice reserves: three area-volume relations, one slope-dependent volume estimation method, and two ice-thickness distribution models are applied to a recent, detailed, and complete glacier inventory of the HK region, spanning the period 2000-2010 and revealing an ice coverage of 40,775 km². An uncertainty and sensitivity assessment is performed to investigate the influence of the observed glacier area and important model parameters on the resulting total ice volume. Results of the two ice-thickness distribution models are validated against local ice-thickness measurements at six glaciers. The resulting ice volumes for the entire HK region range from 2955 to 4737 km³, depending on the approach. This range is lower than most previous estimates. Results from the ice-thickness distribution models and the slope-dependent thickness estimations agree well with the measured local ice thicknesses, whereas total volume estimates from area-related relations are larger than those from the other approaches. The study provides evidence of the significant effect of the selected method on the results and underlines the importance of careful and critical evaluation.
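
As an illustration of the area-volume relations mentioned above, the sketch below applies a power-law scaling V = c·A^γ per glacier; the coefficients used are commonly cited glacier-scaling values and are an assumption here, not the calibration used in the study.

```python
# An illustrative volume-area scaling calculation of the form V = c * A**gamma.
# The coefficients below (c = 0.034, gamma = 1.375) are commonly cited glacier
# scaling values and are an assumption here, not the study's own calibration.
c, gamma = 0.034, 1.375

def glacier_volume_km3(area_km2: float) -> float:
    """Ice volume (km^3) from glacier area (km^2) via a power-law scaling."""
    return c * area_km2 ** gamma

# Area-volume estimates are built by applying the relation per glacier and summing;
# applying it to the total inventory area instead would overestimate the volume.
sample_glacier_areas = [0.5, 2.0, 10.0, 150.0]   # km^2, placeholder inventory entries
print("Per-glacier volumes (km^3):",
      [round(glacier_volume_km3(a), 3) for a in sample_glacier_areas])
```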

Relevance: 10.00%

Abstract:

Large-scale estimates of the area of terrestrial surface waters have greatly improved over time, in particular through the development of multi-satellite methodologies, but the generally coarse spatial resolution (tens of km) of global observations is still inadequate for many ecological applications. The goal of this study is to introduce a new, globally applicable downscaling method and to demonstrate its applicability for deriving fine-resolution results from coarse global inundation estimates. The downscaling procedure predicts the location of surface water cover with an inundation probability map generated by bagged decision trees using globally available topographic and hydrographic information from the SRTM-derived HydroSHEDS database and trained on the wetland extent of the GLC2000 global land cover map. We applied the downscaling technique to the Global Inundation Extent from Multi-Satellites (GIEMS) dataset to produce a new high-resolution inundation map at a pixel size of 15 arc-seconds, termed GIEMS-D15. GIEMS-D15 represents three states of land surface inundation extent: mean annual minimum (total area 6.5 × 10⁶ km²), mean annual maximum (12.1 × 10⁶ km²), and long-term maximum (17.3 × 10⁶ km²); the latter depicts the largest surface water area of any global map to date. While the accuracy of GIEMS-D15 reflects distribution errors introduced by the downscaling process as well as errors from the original satellite estimates, the overall accuracy is good, though spatially variable. A comparison against regional wetland cover maps generated from independent observations shows that the results adequately represent large floodplains and wetlands. GIEMS-D15 offers a higher-resolution delineation of inundated areas than previously available for the assessment of global freshwater resources and the study of large floodplain and wetland ecosystems. The technique of applying inundation probabilities also allows coupling with coarse-scale hydro-climatological model simulations. (C) 2014 Elsevier Inc. All rights reserved.
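
A minimal sketch of the probability-based downscaling idea is given below: bagged decision trees are trained on synthetic stand-ins for topographic and hydrographic predictors, and the resulting per-pixel probabilities are thresholded so that the flagged area matches an assumed coarse-cell inundated fraction. The predictor names, labels, and the 30% fraction are placeholders.

```python
# A minimal sketch (synthetic data, assumed predictor names) of the downscaling idea:
# train bagged decision trees on topographic/hydrographic predictors to output a
# per-pixel inundation probability, then threshold it to match a coarse-cell fraction.
import numpy as np
from sklearn.ensemble import BaggingClassifier

rng = np.random.default_rng(3)
n_pixels = 20000
# Stand-ins for HydroSHEDS-style predictors: height above drainage, slope, distance to river.
X = np.column_stack([rng.gamma(2.0, 5.0, n_pixels),
                     rng.random(n_pixels) * 10.0,
                     rng.gamma(1.5, 2.0, n_pixels)])
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(0, 2, n_pixels) < 8).astype(int)  # wetland label

model = BaggingClassifier(n_estimators=50, random_state=0).fit(X, y)
prob = model.predict_proba(X)[:, 1]            # inundation probability per pixel

# Allocate an assumed coarse-cell inundated fraction (30%) to the most probable pixels.
fraction = 0.30
threshold = np.quantile(prob, 1.0 - fraction)
inundated = prob >= threshold
print(f"Pixels flagged inundated: {inundated.mean():.2%}")
```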