54 results for Correction de textures


Relevance: 20.00%

Abstract:

In a statistical downscaling model, it is important to remove the bias of General Circulation Model (GCM) outputs resulting from various assumptions about the geophysical processes. One conventional method for correcting such bias is standardisation, which is used prior to statistical downscaling to reduce systematic bias in the means and variances of GCM predictors relative to observations or to National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis data. A major drawback of standardisation is that, while it may reduce bias in the mean and variance of a predictor variable, it is much harder to accommodate bias in the large-scale patterns of atmospheric circulation in GCMs (e.g. shifts in the dominant storm track relative to observed data) or unrealistic inter-variable relationships. Such uncorrected bias should be removed before predicting hydrologic scenarios; otherwise it propagates into the computations for subsequent years. In this study, a statistical method based on an equi-probability transformation is applied after downscaling to remove the bias in the predicted hydrologic variable relative to the observed hydrologic variable over a baseline period. The model is applied to the prediction of monsoon streamflow of the Mahanadi River in India from GCM-generated large-scale climatological data.
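
As context for the equi-probability transformation, the sketch below shows one common quantile-mapping form of it: each downscaled value is assigned its non-exceedance probability under the baseline predictions and then mapped through the inverse empirical CDF of the baseline observations. The function name and the plotting-position convention are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def equiprobability_correct(predicted, obs_baseline, pred_baseline):
    # Hypothetical helper illustrating quantile mapping: preserve each
    # predicted value's non-exceedance probability, but evaluate it under
    # the observed baseline distribution instead of the predicted one.
    pred_sorted = np.sort(pred_baseline)
    obs_sorted = np.sort(obs_baseline)
    # Weibull plotting position under the baseline predictions.
    p = np.searchsorted(pred_sorted, predicted, side="right") / (len(pred_sorted) + 1)
    # Invert the observed baseline empirical CDF at those probabilities.
    probs = np.arange(1, len(obs_sorted) + 1) / (len(obs_sorted) + 1)
    return np.interp(p, probs, obs_sorted)
```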

Relevance: 20.00%

Abstract:

In the present investigation, basic studies were conducted using an inclined pin-on-plate sliding tester to understand the role of the surface texture of a hard material sliding against soft materials. Soft materials such as Al-Mg alloy, pure Al and pure Mg were used as pins, and 080 M40 steel was used as the plate. Two surface parameters of the steel plates, roughness and texture, were varied in the tests. It was observed that the formation of the transfer layer and the coefficient of friction, which has two components (adhesion and plowing), are controlled by the surface texture of the harder material. For the Al-Mg alloy, the stick-slip phenomenon was absent under both dry and lubricated conditions. For Al, it was observed only under lubricated conditions, while for Mg it was observed under both dry and lubricated conditions. Further, it was observed that the amplitude of stick-slip motion primarily depends on the plowing component of friction, which was highest for surfaces that promote plane strain conditions near the surface and lowest for surfaces that promote plane stress conditions.

Relevance: 20.00%

Abstract:

In the present investigation, soft materials, namely Al-4Mg alloy, high-purity Al and pure Mg pins, were slid against hard steel plates of various surface textures to study the response of the materials during sliding. The experiments were conducted using an inclined pin-on-plate sliding apparatus under both dry and lubricated conditions in an ambient environment. Two kinds of frictional response, steady-state and stick-slip, were observed during sliding. In general, the response depended on the material pair, normal load, lubrication, and surface texture of the harder material. More specifically, for the Al-4Mg alloy, the stick-slip response was absent under both dry and lubricated conditions. For Al, stick-slip was observed only under lubricated conditions. For Mg, the stick-slip response was seen under both dry and lubricated conditions. Further, it was observed that the amplitude of stick-slip motion primarily depends on the plowing component of friction, which was highest for surfaces that promoted plane strain conditions and lowest for surfaces that promoted plane stress conditions near the surface.

Relevance: 20.00%

Abstract:

In the context of SPH-based simulations of impact dynamics, an optimised and automated form of the acceleration correction algorithm (Shaw and Reid, 2009a) is developed so as to remove spurious high-frequency oscillations in computed responses whilst retaining the stabilizing characteristics of the artificial viscosity in the presence of shocks and layers with sharp gradients. A rational framework for an insightful characterisation of the earlier acceleration correction method is first set up. This is followed by an optimised version of the method, wherein the strength of the correction term in the momentum balance and energy equations is optimised. For the first time, this leads to an automated procedure to arrive at the artificial viscosity term. In particular, this is achieved by taking a spatially varying, response-dependent support size for the kernel function through which the correction term is computed. The optimum value of the support size is deduced by minimising the (spatially localised) total variation of the high-frequency oscillation in the acceleration term with respect to its (local) mean. The derivation of the method, its advantages over the heuristic method and issues related to its numerical implementation are discussed in detail.
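
To make the selection criterion concrete, here is a minimal one-dimensional sketch of choosing, per sample, the smoothing support that minimises the localised total variation of the high-frequency residual about its local mean. It is a toy illustration of the criterion only, not the paper's SPH implementation; all names, and the moving average standing in for the kernel, are assumptions.

```python
import numpy as np

def optimal_support(accel, candidate_sizes):
    # Toy stand-in for the criterion: for each sample, pick the (odd)
    # moving-average window size that minimises the localised total
    # variation of the residual oscillation about the local mean.
    best_h = np.full(len(accel), candidate_sizes[0])
    best_tv = np.full(len(accel), np.inf)
    for h in candidate_sizes:
        kernel = np.ones(h) / h
        local_mean = np.convolve(accel, kernel, mode="same")
        residual = accel - local_mean              # high-frequency oscillation
        jumps = np.abs(np.diff(residual, prepend=residual[0]))
        tv = np.convolve(jumps, kernel, mode="same")  # localised total variation
        better = tv < best_tv
        best_h[better], best_tv[better] = h, tv[better]
    return best_h
```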

Relevance: 20.00%

Abstract:

Motivated by applications to distributed storage, Gopalan et al. recently introduced the interesting notion of information-symbol locality in a linear code. By this it is meant that each message symbol appears in a parity-check equation of small Hamming weight, thereby enabling recovery of the message symbol by examining a small number of other code symbols. This notion is expanded to the case when all code symbols, not just the message symbols, are covered by such "local" parity. In this paper, we extend the results of Gopalan et al. so as to permit recovery of an erased code symbol even in the presence of errors in local parity symbols. We present tight bounds on the minimum distance of such codes and exhibit codes that are optimal with respect to the local error-correction property. As a corollary, we obtain an upper bound on the minimum distance of a concatenated code.
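
To illustrate the basic locality idea (without the error-tolerant extension studied in the paper), the sketch below encodes message bits in groups with one XOR parity each, so that a single erased symbol is repaired from its small local group alone. Names and the group layout are illustrative assumptions.

```python
import numpy as np

def encode(msg_bits, r):
    # Append one XOR parity per group of r message bits; every message
    # symbol then lies in a parity check of Hamming weight r + 1.
    assert len(msg_bits) % r == 0
    parities = msg_bits.reshape(-1, r).sum(axis=1) % 2
    return np.concatenate([msg_bits, parities])

def repair(codeword, erased, r, k):
    # Recover an erased message symbol from its local group only.
    # A local distance-2 code tolerates one erasure; tolerating errors in
    # the local parities, as in the paper, needs local distance > 2.
    g = erased // r
    group = list(range(g * r, (g + 1) * r)) + [k + g]
    return sum(codeword[i] for i in group if i != erased) % 2

msg = np.array([1, 0, 1, 1, 0, 0])
cw = encode(msg, r=3)                  # 6 message bits + 2 local parities
assert repair(cw, erased=4, r=3, k=6) == msg[4]
```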

Relevance: 20.00%

Abstract:

In the present investigation, various kinds of textures were produced on steel surfaces, and the roughness of the textures was varied using different grits of emery paper or polishing powders. Pins made of pure Al, Al-4Mg alloy and pure Mg were then slid against the prepared steel plate surfaces for various numbers of cycles using an inclined pin-on-plate sliding tester. Tests were conducted at a sliding velocity of 2 mm/s in an ambient environment under both dry and lubricated conditions, with normal loads increased up to 110 N. The morphologies of the worn pin surfaces and the formation of the transfer layer on the counter surfaces were observed using a scanning electron microscope, and the surface roughness parameters of the plates were measured using an optical profilometer. It was observed that the coefficient of friction and the formation of a transfer layer (under both dry and lubricated conditions) depended on surface texture only during the first few sliding cycles. The steady-state variation in the coefficient of friction under both dry and lubricated conditions was attributed to the self-organisation of the surface textures at the interface during sliding.

Relevance: 20.00%

Abstract:

Using a Girsanov change of measure, we propose novel variations within a particle-filtering algorithm, applied to the inverse problem of state and parameter estimation in nonlinear dynamical systems of engineering interest, so as to weakly correct for the linearization or integration errors that almost invariably occur while numerically propagating the process dynamics, typically governed by nonlinear stochastic differential equations (SDEs). Specifically, the correction for linearization, provided by the likelihood or the Radon-Nikodym derivative, is incorporated within the evolving flow in two steps. The likelihood, an exponential martingale, is first split into a product of two factors, and the correction owing to the first factor is implemented via rejection sampling. The second factor, which is directly computable, is accounted for via two different schemes: one employing resampling, and the other using a gain-weighted innovation term added to the drift field of the process dynamics, thereby overcoming the problem of sample dispersion posed by resampling. The proposed strategies, employed as add-ons to existing particle filters (the bootstrap and auxiliary SIR filters in this work), are found to non-trivially improve the convergence and accuracy of the estimates and to yield reduced mean-square errors vis-a-vis those obtained through the parent filtering schemes.
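
As a rough illustration of where such a correction factor enters a filter, the sketch below shows one bootstrap-filter step in which the usual observation likelihood is multiplied by an extra, user-supplied correction weight, standing in for the directly computable second factor above; the rejection-sampling step for the first factor is omitted. All names are hypothetical and this is not the paper's algorithm.

```python
import numpy as np

def bootstrap_step(particles, y_obs, drift, dt, obs_lik, corr_weight, rng):
    # Euler-Maruyama propagation of the process SDE: the crude integration
    # whose error the multiplicative correction weight is meant to absorb.
    noise = rng.standard_normal(particles.shape) * np.sqrt(dt)
    prop = particles + drift(particles) * dt + noise
    # Importance weights: observation likelihood times correction factor.
    w = obs_lik(y_obs, prop) * corr_weight(prop, particles)
    w = w / w.sum()
    # Multinomial resampling back to equal weights.
    idx = rng.choice(len(prop), size=len(prop), p=w)
    return prop[idx]
```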

Relevance: 20.00%

Abstract:

General circulation models (GCMs) are routinely used to simulate future climatic conditions. However, rainfall outputs from GCMs are highly uncertain in preserving temporal correlations, frequencies, and intensity distributions, which limits their direct application in downscaling and hydrological modeling studies. To address these limitations, raw outputs of GCMs or regional climate models are often bias-corrected using past observations. In this paper, a methodology is presented that uses a nested bias-correction approach to predict the frequencies and occurrences of severe droughts and wet conditions across India for the period 2050-2099, centered at 2075. Specifically, monthly time series of rainfall from 17 GCMs are used to draw conclusions about extreme events. An increasing trend in the frequencies of both drought and wet events is observed. The northern part of India and the coastal regions show the largest increase in the frequency of wet events, while drought events are expected to increase in the west central, peninsular, and central northeast regions of India.
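
For orientation, the sketch below shows the innermost (monthly) step of a nested bias correction: each calendar month of the raw GCM series is rescaled to the observed monthly mean and standard deviation, and a nested scheme would repeat the same standardisation on annual aggregates. Function names are assumptions and this is a simplified stand-in for the paper's method.

```python
import numpy as np

def monthly_bias_correct(gcm, obs, month):
    # month[i] in 1..12 gives the calendar month of sample i; gcm and obs
    # are aligned over a common baseline period.
    out = np.empty_like(gcm, dtype=float)
    for m in range(1, 13):
        sel = month == m
        # Standardise the GCM month, then rescale to observed statistics.
        z = (gcm[sel] - gcm[sel].mean()) / gcm[sel].std()
        out[sel] = obs[sel].mean() + z * obs[sel].std()
    return out
```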

Relevance: 20.00%

Abstract:

There is a strong relation between sparse signal recovery and error-control coding, and burst errors are known to be block sparse in nature. Here, we therefore attempt to solve the burst error correction problem using block-sparse signal recovery methods. We construct partial-Fourier-based encoding and decoding matrices using results on difference sets. These constructions offer guaranteed and efficient error correction when used in conjunction with reconstruction algorithms that exploit block sparsity.
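
As one example of a reconstruction algorithm that exploits block sparsity, the following is a minimal Block Orthogonal Matching Pursuit sketch: greedily pick the block of columns most correlated with the residual, then re-fit by least squares on all chosen blocks. It is a generic illustration, not the decoding scheme of the paper.

```python
import numpy as np

def block_omp(A, y, block_size, n_blocks):
    # A: measurement matrix whose columns form contiguous blocks; y: data.
    # Greedily recover a signal supported on n_blocks column blocks.
    n = A.shape[1]
    blocks = [np.arange(b, b + block_size) for b in range(0, n, block_size)]
    chosen = []
    residual = y.astype(complex)
    x = np.zeros(n, dtype=complex)
    for _ in range(n_blocks):
        # Score every not-yet-chosen block by correlation with the residual.
        scores = [-np.inf if i in chosen else
                  np.linalg.norm(A[:, blk].conj().T @ residual)
                  for i, blk in enumerate(blocks)]
        chosen.append(int(np.argmax(scores)))
        cols = np.concatenate([blocks[i] for i in chosen])
        # Least-squares re-fit on the union of chosen blocks.
        coef, *_ = np.linalg.lstsq(A[:, cols], y, rcond=None)
        x[:] = 0
        x[cols] = coef
        residual = y - A @ x
    return x
```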

Relevance: 20.00%

Abstract:

Propranolol, a beta-adrenergic receptor blocker, is presently under investigation as a potential therapeutic intervention for the prevention and treatment of osteoporosis. However, no studies have compared its osteoprotective properties with well-accepted therapeutic interventions for osteoporosis. To address this question, this study was designed to evaluate the bone-protective effects of zoledronic acid, alfacalcidol and propranolol in an animal model of postmenopausal osteoporosis. Five days after ovariectomy, 36 ovariectomized (OVX) rats were divided into six equal groups, randomized to treatment with zoledronic acid (100 μg/kg, single intravenous dose), alfacalcidol (0.5 μg/kg, daily oral gavage) or propranolol (0.1 mg/kg, subcutaneously 5 days per week) for 12 weeks. Untreated OVX and sham-OVX rats were used as controls. At the end of the study, the rats were killed under anesthesia. Whole fourth lumbar vertebrae (LV4) were removed for evaluation of bone porosity and for measurement of bone mechanical properties, and the left femurs were used for bone histology. Propranolol showed a significant decrease in bone porosity in comparison with the OVX controls, and it significantly improved bone mechanical properties and bone quality. Its osteoprotective effect was comparable with that of zoledronic acid and alfacalcidol. Based on this comparative study, the results strongly suggest that propranolol might be a new therapeutic intervention for the management of postmenopausal osteoporosis in humans.

Relevance: 20.00%

Abstract:

In this work, a combined forming and fracture limit diagram, void coalescence at fracture and texture analysis were experimentally evaluated for the commercially available aluminum alloy Al 8011 in sheet form, annealed at different temperatures, viz. 200 °C, 250 °C, 300 °C and 350 °C. The sheets annealed at these temperatures were examined for microstructure, tensile properties, formability and void coalescence. The fractured surfaces of the formed samples were examined using a scanning electron microscope (SEM), and the images were correlated with the fracture behavior and formability of the sheet metal. The formability of Al 8011 at the various annealing temperatures was also studied using bulk X-ray crystallographic textures and ODF plots. Forming limit diagrams, void coalescence parameters and crystallographic textures were correlated with the normal anisotropy of the sheet metal annealed at the different temperatures.

Relevance: 20.00%

Abstract:

Regenerating codes and codes with locality are two recently proposed coding schemes which, in addition to ensuring data collection and reliability, also enable efficient node repair. When repairing a failed node, regenerating codes seek to minimize the amount of data downloaded, while codes with locality seek to minimize the number of helper nodes accessed. This paper presents results in two directions. In the first, the notion of codes with locality is extended so as to permit local recovery of an erased code symbol even in the presence of multiple erasures, by employing local codes having minimum distance > 2. An upper bound on the minimum distance of such codes is presented, and codes that are optimal with respect to this bound are constructed. The second direction seeks to build codes that combine the advantages of codes with locality and regenerating codes. These codes, termed codes with local regeneration, are codes with locality over a vector alphabet in which the local codes are themselves regenerating codes. We derive an upper bound on the minimum distance of vector-alphabet codes with locality for the case when their constituent local codes have a certain uniform rank-accumulation property, which is possessed by both minimum storage regenerating (MSR) and minimum bandwidth regenerating (MBR) codes. We provide several constructions of codes with local regeneration that achieve this bound, where the local codes are either MSR or MBR codes. Also included in this paper are an upper bound on the minimum distance of a general vector code with locality and a performance comparison of various code constructions of fixed block length and minimum distance.
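
For reference, a commonly cited form of the minimum-distance bound for codes whose symbols have (r, δ) locality (local codes of minimum distance δ ≥ 2) is sketched below; whether it matches the paper's exact statement, and the parameter naming, are assumptions.

```python
from math import ceil

def locality_distance_bound(n, k, r, delta):
    # Upper bound on the minimum distance of an [n, k] code in which every
    # symbol belongs to a local code of dimension <= r and minimum
    # distance >= delta; delta = 2 recovers the Gopalan et al. bound
    # d <= n - k - ceil(k / r) + 2.
    return n - k + 1 - (ceil(k / r) - 1) * (delta - 1)

assert locality_distance_bound(n=14, k=8, r=4, delta=2) == 14 - 8 - 2 + 2
```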

Relevance: 20.00%

Abstract:

In this study, we applied the integration methodology developed in the companion paper by Aires (2014) to real satellite observations over the Mississippi Basin. The methodology provides basin-scale estimates of the four water budget components (precipitation P, evapotranspiration E, water storage change ΔS, and runoff R) in a two-step process: a Simple Weighting (SW) integration followed by a Postprocessing Filtering (PF) that imposes closure of the water budget. A comparison with in situ observations of P and E demonstrated that PF improves the estimation of both components. A Closure Correction Model (CCM) was derived from the integrated product (SW+PF); it allows each observation data set to be corrected independently, unlike the SW+PF method, which requires simultaneous estimates of all four components. The CCM makes it possible to standardize the various data sets for each component and greatly reduces the budget residual (P - E - ΔS - R). As a direct application, the CCM was combined with the water budget equation to reconstruct missing values in any component. Results of a Monte Carlo experiment with synthetic gaps demonstrated the good performance of the method, except for the runoff data, whose variability is of the same order of magnitude as the budget residual. Similarly, we propose a reconstruction of ΔS between 1990 and 2002, for which no Gravity Recovery and Climate Experiment (GRACE) data are available. Unlike most studies dealing with water budget closure at the basin scale, only satellite observations and in situ runoff measurements are used; consequently, the integrated data sets are model independent and can be used for model calibration or validation.
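
To illustrate what a closure-imposing post-processing step can look like, the sketch below makes the minimal variance-weighted adjustment to the four component estimates so that P - E - ΔS - R = 0 holds exactly. This is a generic constrained-least-squares step under assumed independent error variances, not necessarily the exact filter of Aires (2014).

```python
import numpy as np

def close_budget(x, err_var):
    # x = [P, E, dS, R] component estimates; err_var = their error variances.
    # Minimal (variance-weighted) adjustment enforcing G @ x_adj = 0,
    # where G encodes the closure constraint P - E - dS - R = 0.
    G = np.array([1.0, -1.0, -1.0, -1.0])
    imbalance = G @ x                        # current budget residual
    gain = err_var * G / (G @ (err_var * G))
    return x - gain * imbalance

x = np.array([100.0, 60.0, 5.0, 30.0])       # residual = 5
x_adj = close_budget(x, err_var=np.array([9.0, 9.0, 1.0, 4.0]))
assert abs(x_adj[0] - x_adj[1] - x_adj[2] - x_adj[3]) < 1e-9
```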