278 results for Summed estimation scales


Relevance:

20.00%

Publisher:

Abstract:

The electromagnetic articulography (EMA) technique is used to record the kinematics of different articulators while one speaks. EMA data often contain missing segments due to sensor failure. In this work, we propose a maximum a-posteriori (MAP) estimation with a continuity constraint to recover the missing samples in articulatory trajectories recorded using EMA. In this approach, we combine the benefits of statistical MAP estimation with the temporal continuity of the articulatory trajectories. Experiments on an articulatory corpus using different missing-segment durations show that the proposed continuity constraint results in a 30% reduction in average root mean squared estimation error over statistical estimation of missing segments without any continuity constraint.
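
The abstract does not spell out the estimator; the minimal sketch below (Python/NumPy; the Gaussian-prior-plus-second-difference formulation and all names are assumptions, not the authors' exact model) illustrates how a MAP fill-in with a continuity penalty can be posed and solved as a linear system.

```python
import numpy as np

def map_fill_with_continuity(x, missing, mu, cov, lam=10.0):
    """MAP fill-in of missing samples in one articulatory trajectory.

    x        : trajectory of length T (values at missing indices are ignored)
    missing  : boolean mask marking the missing samples
    mu, cov  : Gaussian prior (mean, covariance) over the T samples, assumed
               to be learned from complete EMA recordings
    lam      : weight of the second-difference (temporal continuity) penalty
    """
    T = len(x)
    obs = ~missing
    P = np.linalg.inv(cov)                       # prior precision
    D = np.zeros((T - 2, T))                     # second-difference operator
    for i in range(T - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Minimize (z - mu)^T P (z - mu) + lam * ||D z||^2 with the observed
    # samples of z held fixed; stationarity gives (P + lam D^T D) z = P mu.
    A = P + lam * (D.T @ D)
    b = P @ mu
    z_m = np.linalg.solve(A[np.ix_(missing, missing)],
                          b[missing] - A[np.ix_(missing, obs)] @ x[obs])
    filled = x.copy()
    filled[missing] = z_m
    return filled

# toy usage: fill a gap in a sinusoidal "trajectory"
t = np.linspace(0.0, 1.0, 100)
truth = np.sin(2 * np.pi * 3 * t)
missing = np.zeros(100, dtype=bool)
missing[40:55] = True
mu = np.zeros(100)
cov = np.exp(-((t[:, None] - t[None, :]) / 0.1) ** 2) + 1e-3 * np.eye(100)
recovered = map_fill_with_continuity(truth, missing, mu, cov)
```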

Relevance:

20.00%

Publisher:

Abstract:

Materials with widely varying molecular topologies and exhibiting liquid crystalline properties have attracted considerable attention in recent years. C-13 NMR spectroscopy is a convenient method for studying such novel systems. In this approach, assignment of the spectrum is the first step, and it is a non-trivial problem. Towards this end, we propose here a method that enables the carbon skeleton of the different sub-units of the molecule to be traced unambiguously. The proposed method uses a heteronuclear correlation experiment to detect pairs of nearby carbons with attached protons in the liquid crystalline core, through correlation of the carbon chemical shifts to the double-quantum coherences of protons generated through the dipolar coupling between them. Supplemented by experiments that identify non-protonated carbons, the method leads to a complete assignment of the spectrum. We initially apply this method to assigning the C-13 spectrum of the liquid crystal 4-n-pentyl-4'-cyanobiphenyl oriented in the magnetic field. We then utilize the method to assign the aromatic carbon signals of a thiophene-based liquid crystal, thereby enabling the local order parameters of the molecule to be estimated and the mutual orientation of the different sub-units to be obtained.

Relevance:

20.00%

Publisher:

Abstract:

A simple method employing an optical probe is presented to measure density variations in a hypersonic flow obstructed by a test model in a typical shock tunnel. The probe has a plane light wave trans-illuminating the flow and casting a shadow of a random dot pattern. Local slopes of the distorted wavefront are obtained from shifts of the dots in the pattern. Local shifts in the dots are accurately measured by cross-correlating the local shifted shadows with the corresponding unshifted originals. The measured slopes are unwrapped using a discrete cosine transform based phase-unwrapping procedure as well as iterative procedures. The unwrapped phase information is used in an iterative scheme for a full quantitative recovery of the density distribution in the shock around the model through refraction tomographic inversion. Hypersonic flow field parameters around a missile-shaped body at a free-stream Mach number of 5.8 measured using this technique are compared with numerically estimated values. (C) 2014 Society of Photo-Optical Instrumentation Engineers (SPIE)
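
As an illustration of the dot-shift measurement step, the hedged sketch below (Python/NumPy; the 32-pixel window and function names are assumptions) estimates per-window dot displacements, which are proportional to the local wavefront slopes, by locating the peak of an FFT-based cross-correlation between the undisturbed and disturbed shadow images.

```python
import numpy as np

def window_shift(ref, shifted):
    """Integer (dy, dx) displacement between two interrogation windows from
    the peak of their FFT-based cross-correlation."""
    R = np.fft.fft2(ref - ref.mean())
    S = np.fft.fft2(shifted - shifted.mean())
    corr = np.fft.ifft2(R.conj() * S).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # wrap indices beyond half the window size into negative shifts
    dy = peak[0] if peak[0] <= ref.shape[0] // 2 else peak[0] - ref.shape[0]
    dx = peak[1] if peak[1] <= ref.shape[1] // 2 else peak[1] - ref.shape[1]
    return dy, dx

def shift_field(ref_img, dist_img, win=32):
    """Tile the reference and distorted shadow images into win x win windows
    and return the per-window dot displacements."""
    ny, nx = ref_img.shape[0] // win, ref_img.shape[1] // win
    field = np.zeros((ny, nx, 2))
    for j in range(ny):
        for i in range(nx):
            sl = np.s_[j * win:(j + 1) * win, i * win:(i + 1) * win]
            field[j, i] = window_shift(ref_img[sl], dist_img[sl])
    return field
```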

Relevance:

20.00%

Publisher:

Abstract:

Storage of water within a river basin is often estimated by analyzing recession flow curves, as it cannot be `instantly' estimated with the aid of available technologies. In this study we explicitly deal with the issue of estimating `drainable' storage, which is equal to the area under the `complete' recession flow curve (i.e. a discharge vs. time curve in which discharge continuously decreases until it approaches zero). A major challenge in this regard is that recession curves are rarely `complete' due to short inter-storm time intervals. Therefore, it is essential to analyze and model recession flows meaningfully. We adopt the well-known Brutsaert and Nieber analytical method that expresses the time derivative of discharge (dQ/dt) as a power law function of Q: -dQ/dt = kQ^alpha. However, the problem with dQ/dt-Q analysis is that it is not suitable for late recession flows. Traditional studies often compute alpha considering early recession flows and assume that its value is constant for the whole recession event. But this approach gives unrealistic results when alpha >= 2, a common case. We address this issue here by using the recently proposed geomorphological recession flow model (GRFM), which exploits the dynamics of active drainage networks. According to the model, alpha is close to 2 for early recession flows and 0 for late recession flows. We then derive a simple expression for drainable storage in terms of the power law coefficient k, obtained by considering early recession flows only, and basin area. Using 121 complete recession curves from 27 USGS basins we show that predicted drainable storage matches well with observed drainable storage, indicating that the model can also reliably estimate drainable storage for `incomplete' recession events to address many challenges related to water resources. (C) 2014 Elsevier Ltd. All rights reserved.
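
The paper's closed-form storage expression is not reproduced here; the sketch below (Python/NumPy, illustrative only) shows the two ingredients the abstract describes: fitting the early-recession coefficient k with alpha fixed at 2, and computing the observed drainable storage of a complete recession curve as the area under Q(t).

```python
import numpy as np

def brutsaert_nieber_k(q, dt=86400.0, alpha=2.0):
    """Fit k in -dQ/dt = k * Q**alpha for the early part of a recession limb,
    with alpha fixed (here at the early-recession value 2).

    q  : streamflow samples of one recession event (m^3/s), decreasing
    dt : sampling interval in seconds (daily data assumed)
    """
    dqdt = -np.diff(q) / dt
    qm = 0.5 * (q[1:] + q[:-1])          # midpoint discharge
    good = (dqdt > 0) & (qm > 0)
    # least squares for log k with the slope fixed at alpha:
    # log(-dQ/dt) = log k + alpha * log Q
    return np.exp(np.mean(np.log(dqdt[good]) - alpha * np.log(qm[good])))

def observed_drainable_storage(q, dt=86400.0):
    """Drainable storage of a 'complete' recession event: the area under the
    Q(t) curve until discharge approaches zero (trapezoidal integration)."""
    return np.trapz(q, dx=dt)            # volume in m^3
```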

Relevance:

20.00%

Publisher:

Abstract:

The estimation of water and solute transit times in catchments is crucial for predicting the response of hydrosystems to external forcings (climatic or anthropogenic). The hydrogeochemical signatures of tracers (either natural or anthropogenic) in streams have been widely used to estimate transit times in catchments, as they integrate the various processes at stake. However, most of these tracers are only well suited for catchments with mean transit times shorter than about 4-5 years. Since the second half of the 20th century, the intensification of agriculture has led to a general increase of the nitrogen load in rivers. As nitrate is mainly transported by groundwater in agricultural catchments, this signal can be used to estimate transit times greater than several years, even if nitrate is not a conservative tracer. Conceptual hydrological models can be used to estimate catchment transit times provided their consistency is demonstrated, based on their ability to simulate the stream chemical signatures at various time scales and catchment internal processes such as N storage in groundwater. The objective of this study was to assess whether a conceptual lumped model was able to simulate the observed patterns of nitrogen concentration at various time scales, from seasonal to pluriannual, and thus whether it was relevant for estimating nitrogen transit times in headwater catchments. A conceptual lumped model, representing shallow groundwater flow as two parallel linear stores with double porosity, and riparian processes by a constant nitrogen removal function, was applied to two paired agricultural catchments belonging to the Research Observatory ORE AgrHys. The Generalized Likelihood Uncertainty Estimation (GLUE) approach was used to estimate parameter values and uncertainties. The model performance was assessed on (i) its ability to simulate the contrasted patterns of stream flow and stream nitrate concentrations at seasonal and inter-annual time scales, (ii) its ability to simulate the patterns observed in groundwater at the same temporal scales, and (iii) the consistency of long-term simulations using the calibrated model with the general pattern of the nitrate concentration increase in the region since the beginning of the intensification of agriculture in the 1960s. The simulated nitrate transit times were found to be more sensitive to climate variability than to parameter uncertainty, and average values were found to be consistent with results from other studies in the same region involving modeling and groundwater dating. This study shows that a simple model can be used to simulate the main dynamics of nitrogen in an intensively polluted catchment and then to estimate the transit times of these pollutants in the system, which is crucial for guiding the design and assessment of mitigation plans. (C) 2015 Elsevier B.V. All rights reserved.
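
A minimal sketch of the two ideas named above, two parallel linear stores and GLUE-style behavioural screening, is given below (Python/NumPy; parameter names, values and the Nash-Sutcliffe likelihood measure are assumptions, not the calibrated model of the study).

```python
import numpy as np

def two_store_model(recharge, k_fast=0.05, k_slow=0.005, split=0.6):
    """Shallow groundwater as two parallel linear stores (fast and slow).

    recharge : daily recharge series (mm/day)
    k_*      : linear outflow constants (1/day)
    split    : fraction of recharge routed to the fast store
    """
    s_fast = s_slow = 0.0
    q = np.zeros(len(recharge))
    for t, r in enumerate(recharge):
        s_fast += split * r
        s_slow += (1.0 - split) * r
        q_fast, q_slow = k_fast * s_fast, k_slow * s_slow
        s_fast -= q_fast
        s_slow -= q_slow
        q[t] = q_fast + q_slow
    return q

def glue_behavioural(param_sets, simulate, observed, threshold=0.6):
    """GLUE-style screening: keep parameter sets whose Nash-Sutcliffe
    efficiency against the observations exceeds a behavioural threshold."""
    kept = []
    for p in param_sets:
        sim = simulate(**p)
        nse = 1.0 - np.sum((sim - observed) ** 2) / \
              np.sum((observed - observed.mean()) ** 2)
        if nse >= threshold:
            kept.append((p, nse))
    return kept
```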

Relevance:

20.00%

Publisher:

Abstract:

Estimating municipal solid waste settlement, and the contribution of each of its components, is essential for estimating the volume of waste that can be accommodated in a landfill and for increasing the post-closure use of the landfill. This article describes an experimental methodology for estimating and separating primary settlement, settlement owing to creep, and biodegradation-induced settlement. The primary and secondary settlements were estimated and separated based on the time for 100% pore pressure dissipation and the coefficient of consolidation. Mechanical creep and biodegradation settlements were estimated and separated based on the observed time required for landfill gas production. A series of laboratory triaxial tests, creep tests and anaerobic reactor cell setups was conducted to characterize the components of settlement. All the tests were conducted on municipal solid waste (compost reject) samples. It was observed that biodegradation accounted for more than 40% of the total settlement, whereas mechanical creep contributed more than 20% of the total settlement. The essential model parameters, such as the compression ratio (C_c'), rate of mechanical creep (c), coefficient of mechanical creep (b), rate of biodegradation (d) and the total strain owing to biodegradation (E_DG), are useful in estimating the total settlement as well as the components of settlement in a landfill.
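
One common formulation that uses exactly the parameters listed above expresses total strain as the sum of a primary, a mechanical-creep and a biodegradation term; the sketch below (Python/NumPy, with placeholder parameter values rather than the paper's results) shows that decomposition.

```python
import numpy as np

def msw_strain(t_days, sigma, sigma0, Cc=0.2, b=1e-3, c=1e-3, Edg=0.15, d=5e-4):
    """Total vertical strain of an MSW layer as the sum of the three settlement
    components discussed above. Stresses in kPa, rates in 1/day; all values
    are placeholders, not the parameters reported in the paper."""
    primary = Cc * np.log10(sigma / sigma0)            # immediate compression
    creep = b * sigma * (1.0 - np.exp(-c * t_days))    # mechanical creep
    biodeg = Edg * (1.0 - np.exp(-d * t_days))         # biodegradation-induced
    return primary + creep + biodeg

# example: strain history under 100 kPa (initial stress 20 kPa) over ten years
t = np.linspace(0.0, 3650.0, 200)
strain = msw_strain(t, sigma=100.0, sigma0=20.0)
```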

Relevance:

20.00%

Publisher:

Abstract:

This paper addresses the problem of intercepting highly maneuverable threats using seeker-less interceptors that operate in the command guidance mode. These systems are more prone to estimation errors than standard seeker-based systems. In this paper, an integrated estimation/guidance (IEG) algorithm, which combines an interactive multiple model (IMM) estimator with a differential game guidance law (DGL), is proposed for seeker-less interception. In this interception scenario, the target performs an evasive bang-bang maneuver, the sensor provides noisy measurements, and the interceptor is subject to an acceleration bound. The IMM serves as a basis for synthesizing efficient filters for tracking maneuvering targets and reducing estimation errors. The proposed game-based guidance law for two-dimensional interception, later extended to three-dimensional interception scenarios, is used to improve the endgame performance of the command-guided seeker-less interceptor. The IMM scheme and an optimal selection of filters, to cater to the various maneuvers expected during the endgame, are also described. Furthermore, a chatter removal algorithm is introduced, modifying the differential game guidance law (modified DGL). A comparison between the modified DGL guidance law and the conventional proportional navigation guidance law demonstrates a significant improvement in miss distance in a pursuer-evader scenario. Simulation results are also presented for varying flight path angle errors. A numerical study is provided that demonstrates the performance of the combined interactive multiple model with game-based guidance law (IMM/DGL). A simulation study is also carried out for the combined IMM and modified DGL (IMM/modified DGL), which exhibits the superior performance and viability of the algorithm in reducing the chattering phenomenon. The results are illustrated by an extensive Monte Carlo simulation study in the presence of estimation errors.
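
For reference, a minimal sketch of one IMM cycle (mixing, mode-matched filtering, mode-probability update) is given below in Python/NumPy; the filter interface is hypothetical and the guidance-law side is not shown.

```python
import numpy as np

def imm_step(filters, mu, Pi, z):
    """One cycle of an interactive multiple model (IMM) estimator.

    filters : list of objects with .x, .P and a .step(x, P, z) method that
              returns (x_new, P_new, likelihood), e.g. Kalman filters tuned to
              different target-maneuver hypotheses (hypothetical interface)
    mu      : current mode probabilities, shape (M,)
    Pi      : Markov mode-transition matrix, shape (M, M)
    z       : current measurement
    """
    M = len(filters)
    # 1. mixing: w[i, j] = P(mode i at k-1 | mode j at k)
    c = Pi.T @ mu                                  # predicted mode probabilities
    w = (Pi * mu[:, None]) / c[None, :]
    x_mix = [sum(w[i, j] * filters[i].x for i in range(M)) for j in range(M)]
    P_mix = [sum(w[i, j] * (filters[i].P +
                 np.outer(filters[i].x - x_mix[j], filters[i].x - x_mix[j]))
                 for i in range(M)) for j in range(M)]
    # 2. mode-matched filtering
    like = np.zeros(M)
    for j, f in enumerate(filters):
        f.x, f.P, like[j] = f.step(x_mix[j], P_mix[j], z)
    # 3. mode-probability update and combined state estimate
    mu_new = like * c
    mu_new /= mu_new.sum()
    x_out = sum(mu_new[j] * filters[j].x for j in range(M))
    return mu_new, x_out
```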

Relevance:

20.00%

Publisher:

Abstract:

A major drawback of studying diffusion in multi-component systems is the lack of suitable techniques to estimate the diffusion parameters. In this study, a generalized treatment to determine the intrinsic diffusion coefficients in multi-component systems is developed utilizing the concept of a pseudo-binary approach. This is explained with the help of experimentally developed diffusion profiles in the Cu(Sn, Ga) and Cu(Sn, Si) solid solutions. (C) 2015 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

The inversion of canopy reflectance models is widely used for the retrieval of vegetation properties from remote sensing. This study evaluates the retrieval of the soybean biophysical variables leaf area index, leaf chlorophyll content, canopy chlorophyll content, and equivalent leaf water thickness from proximal reflectance data integrated to broadbands corresponding to the Moderate Resolution Imaging Spectroradiometer, Thematic Mapper, and Linear Imaging Self Scanning sensors, through inversion of the canopy radiative transfer model PROSAIL. Three different inversion approaches, namely the look-up table, genetic algorithm, and artificial neural network, were used and their performances were evaluated. Application of the genetic algorithm to crop parameter retrieval, among the variety of optimization problems in remote sensing, is a new attempt and is successfully demonstrated in the present study. Its performance was as good as that of the look-up table approach, whereas the artificial neural network was a poor performer. The general order of estimation accuracy for parameters, irrespective of inversion approach, was leaf area index > canopy chlorophyll content > leaf chlorophyll content > equivalent leaf water thickness. Performance of inversion was comparable for broadband reflectances of all three sensors in the optical region, with insignificant differences in estimation accuracy among them.
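
A hedged sketch of the look-up-table inversion step is shown below (Python/NumPy); the RMSE cost function and the "mean of the best-matching fraction of entries" selection rule are common choices and may differ from those used in the study.

```python
import numpy as np

def lut_invert(measured, lut_spectra, lut_params, best_frac=0.05):
    """Look-up-table inversion sketch.

    measured    : (n_bands,) observed broadband reflectance
    lut_spectra : (n_entries, n_bands) reflectances simulated offline by a
                  radiative-transfer model (e.g. PROSAIL) for lut_params
    lut_params  : (n_entries, n_params) parameter combinations, e.g.
                  [LAI, leaf chlorophyll, equivalent water thickness, ...]
    """
    rmse = np.sqrt(np.mean((lut_spectra - measured) ** 2, axis=1))
    n_best = max(1, int(best_frac * len(rmse)))
    best = np.argsort(rmse)[:n_best]
    # retrieved parameters: mean over the best-matching LUT entries
    return lut_params[best].mean(axis=0)
```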

Relevance:

20.00%

Publisher:

Abstract:

Probable maximum precipitation (PMP) is a theoretical concept that is widely used by hydrologists to arrive at estimates of probable maximum flood (PMF), which find use in planning, design and risk assessment of high-hazard hydrological structures such as flood control dams upstream of populated areas. The PMP represents the greatest depth of precipitation for a given duration that is meteorologically possible for a watershed or an area at a particular time of year, with no allowance made for long-term climatic trends. Various methods are in use for estimating PMP over a target location for different durations; the moisture maximization method and the Hershfield method are two widely used ones. The former maximizes observed storms assuming that the atmospheric moisture can rise to a very high value estimated from the maximum daily dew point temperature. The latter is a statistical method based on the general frequency equation given by Chow. The present study provides one-day PMP estimates and PMP maps for the Mahanadi river basin based on the aforementioned methods. There is a need for such estimates and maps, as the river basin is prone to frequent floods. The utility of the constructed PMP maps in computing PMP for various catchments in the river basin is demonstrated. The PMP estimates can eventually be used to arrive at PMF estimates for those catchments. (C) 2015 The Authors. Published by Elsevier B.V.
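
The two estimation routes can be written compactly; the sketch below (Python/NumPy) uses Chow's general frequency equation for the Hershfield-type estimate and a simple precipitable-water ratio for moisture maximization. The frequency factor and variable names are illustrative, not the values adopted for the Mahanadi basin.

```python
import numpy as np

def hershfield_pmp(annual_max_mm, km=15.0):
    """Hershfield-type one-day PMP via Chow's general frequency equation
    X = mean + K * std. km = 15 is the classic upper-bound frequency factor;
    station-specific (usually smaller) values are used in practice."""
    x = np.asarray(annual_max_mm, dtype=float)
    return x.mean() + km * x.std(ddof=1)

def moisture_maximized_depth(storm_depth_mm, pw_max_mm, pw_storm_mm):
    """Moisture maximization: scale an observed storm depth by the ratio of
    the maximum precipitable water (from the maximum daily dew point) to the
    precipitable water of the actual storm."""
    return storm_depth_mm * pw_max_mm / pw_storm_mm
```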

Relevance:

20.00%

Publisher:

Abstract:

Regional frequency analysis is widely used for estimating quantiles of hydrological extreme events at sparsely gauged/ungauged target sites in river basins. It involves identification of a region (group of watersheds) resembling the watershed of the target site, and use of information pooled from the region to estimate the quantile for the target site. In the analysis, the watershed of the target site is assumed to completely resemble the watersheds in the identified region in terms of the mechanism underlying the generation of extreme events. In reality, it is rare to find watersheds that completely resemble each other. A fuzzy clustering approach can account for partial resemblance of watersheds and yield region(s) for the target site. Formation of regions and quantile estimation requires discerning information from the fuzzy-membership matrix obtained based on the approach. Practitioners often defuzzify the matrix to form disjoint clusters (regions) and use them as the basis for quantile estimation. The defuzzification approach (DFA) results in loss of the information discerned on partial resemblance of watersheds. The lost information cannot be utilized in quantile estimation, owing to which the estimates could have significant error. To avert the loss of information, a threshold strategy (TS) was considered in some prior studies. In this study, it is analytically shown that the strategy results in under-prediction of quantiles. To address this, a mathematical approach is proposed in this study and its effectiveness in estimating flood quantiles relative to DFA and TS is demonstrated through Monte Carlo simulation experiments and a case study on the Mid-Atlantic water resources region, USA. (C) 2015 Elsevier B.V. All rights reserved.
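
Purely as an illustration of using memberships without defuzzification (this is not the mathematical approach proposed in the paper), the sketch below weights each site's contribution to a regional pooling step by its fuzzy membership and record length.

```python
import numpy as np

def membership_weighted_pooling(site_stats, record_len, membership):
    """Membership-weighted regional average of at-site statistics.

    site_stats : (n_sites, k) at-site statistics used in regional frequency
                 analysis (e.g. L-moment ratios)
    record_len : (n_sites,) record lengths of the gauged sites
    membership : (n_sites,) fuzzy membership of each site in the target region
    """
    w = membership * record_len
    return (w[:, None] * site_stats).sum(axis=0) / w.sum()
```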

Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo simulation methods involving splitting of Markov chains have been used in the evaluation of multi-fold integrals in different application areas. We examine in this paper the performance of these methods in the context of evaluating reliability integrals, from the point of view of characterizing the sampling fluctuations. The methods discussed include the Au-Beck subset simulation, the Holmes-Diaconis-Ross method, and the generalized splitting algorithm. A few improvisations based on the first-order reliability method are suggested to select algorithmic parameters of the latter two methods. The bias and sampling variance of the alternative estimators are discussed. Also, an approximation to the sampling distribution of some of these estimators is obtained. Illustrative examples involving component and series system reliability analyses are presented with a view to bringing out the relative merits of the alternative methods. (C) 2015 Elsevier Ltd. All rights reserved.
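
A compact subset-simulation sketch in the Au-Beck spirit is given below (Python/NumPy; the proposal scale, level probability p0 and other tuning choices are illustrative).

```python
import numpy as np

def subset_simulation(g, dim, n=1000, p0=0.1, rng=np.random.default_rng(0)):
    """Estimate P[g(X) <= 0] for standard-normal X by subset simulation:
    a product of conditional level probabilities, with a component-wise
    random-walk Metropolis step for sampling within each conditional level."""
    x = rng.standard_normal((n, dim))
    y = np.array([g(xi) for xi in x])
    p = 1.0
    for _ in range(50):                          # at most 50 intermediate levels
        n_seed = int(p0 * n)
        idx = np.argsort(y)[:n_seed]             # seeds in the current lower tail
        level = y[idx[-1]]
        if level <= 0.0:                         # failure level reached
            return p * np.mean(y <= 0.0)
        p *= p0
        seeds_x, seeds_y = x[idx], y[idx]
        chains_x, chains_y = [seeds_x], [seeds_y]
        for _ in range(n // n_seed - 1):         # grow each seed chain
            cand = seeds_x + 0.5 * rng.standard_normal(seeds_x.shape)
            # Metropolis acceptance for the standard-normal target
            logr = 0.5 * (np.sum(seeds_x ** 2, axis=1) - np.sum(cand ** 2, axis=1))
            accept = np.log(rng.random(n_seed)) < logr
            cand[~accept] = seeds_x[~accept]
            cand_y = np.array([g(c) for c in cand])
            reject = cand_y > level              # must stay inside the subset
            cand[reject], cand_y[reject] = seeds_x[reject], seeds_y[reject]
            seeds_x, seeds_y = cand, cand_y
            chains_x.append(cand)
            chains_y.append(cand_y)
        x, y = np.vstack(chains_x), np.hstack(chains_y)
    return p * np.mean(y <= 0.0)

# usage: probability that a standard-normal component exceeds 3.5
pf = subset_simulation(lambda x: 3.5 - x[0], dim=2)
```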

Relevance:

20.00%

Publisher:

Abstract:

The study follows an approach to estimate phytomass using recent techniques of remote sensing and digital photogrammetry. It involved a tree inventory of forest plantations in the Bhakra forest range of Nainital district. A panchromatic stereo dataset from Cartosat-1 was evaluated for mean stand height retrieval. Texture analysis and tree-top detection were performed on QuickBird PAN data. A composite texture image of mean, variance and contrast with a 5x5 pixel window was found to be the best for separating tree crowns for assessment of crown areas. The tree-top count obtained by local maxima filtering was found to be 83.4% efficient, with an RMSE of +/-13, for 35 sample plots. The predicted phytomass ranged from 27.01 to 35.08 t/ha for Eucalyptus sp. and from 26.52 to 156 t/ha for Tectona grandis. The correlation between observed and predicted phytomass in Eucalyptus sp. was 0.468, with an RMSE of 5.12. However, the phytomass prediction in Tectona grandis was fairly strong, with R^2 = 0.65 and an RMSE of 9.89, as there was no undergrowth and the crowns were clearly visible. Results of the study show the potential of the Cartosat-1 derived DSM and QuickBird texture image for the estimation of stand height, stem diameter, tree count and phytomass of important timber species.
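
The tree-top detection step can be sketched as a local-maxima filter; the snippet below (Python with scipy.ndimage; the brightness/height threshold is an assumption) keeps pixels that equal the maximum of their neighbourhood window.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_tree_tops(img, window=5, min_value=0.0):
    """Local-maxima filtering for tree-top detection: a pixel is kept as a
    tree top if it equals the maximum of its window x window neighbourhood
    and exceeds a threshold (here an arbitrary cut-off).

    img : panchromatic image or canopy height model as a 2-D array
    """
    local_max = maximum_filter(img, size=window) == img
    tops = local_max & (img > min_value)
    return np.argwhere(tops)            # (row, col) positions of detected tops
```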

Relevance:

20.00%

Publisher:

Abstract:

The main objective of the paper is to develop a new method to estimate the maximum magnitude (M_max) considering the regional rupture character. The proposed method is explained in detail and examined for both intraplate and active regions. Seismotectonic data were collected for both regions, and seismic study area (SSA) maps were generated for radii of 150, 300, and 500 km. The regional rupture character was established by considering the percentage fault rupture (PFR), which is the ratio of subsurface rupture length (RLD) to total fault length (TFL). PFR is used to arrive at RLD, which is further used for the estimation of the maximum magnitude of each seismic source. Maximum magnitudes for both regions were estimated and compared with existing methods for determining M_max values. The proposed method gives similar M_max values irrespective of SSA radius and seismicity. Further, seismicity parameters such as the magnitude of completeness (M_c), the "a" and "b" parameters and the maximum observed magnitude (M_max(obs)) were determined for each SSA and used to estimate M_max by considering all the existing methods. It is observed from the study that existing deterministic and probabilistic M_max estimation methods are sensitive to the SSA radius, M_c, the a and b parameters and the M_max(obs) value. However, M_max determined from the proposed method is a function of the rupture character instead of the seismicity parameters. It was also observed that the intraplate region has a lower PFR than the active seismic region.
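
As an illustration of converting a rupture character into a magnitude, the sketch below scales the total fault length by PFR to obtain RLD and applies the Wells and Coppersmith (1994) all-slip-type subsurface-rupture-length regression; the paper may use a different empirical relation, so this is a hedged example only.

```python
import math

def mmax_from_fault(total_fault_len_km, pfr):
    """M_max for one seismic source from its rupture character.

    total_fault_len_km : total fault length TFL (km)
    pfr                : percentage fault rupture as a fraction (RLD / TFL)
    """
    rld = pfr * total_fault_len_km      # subsurface rupture length RLD (km)
    # Wells & Coppersmith (1994), all slip types: M = 4.38 + 1.49 * log10(RLD)
    return 4.38 + 1.49 * math.log10(rld)

# e.g. a 100 km long fault with a 26% rupture character gives roughly Mw 6.5
print(mmax_from_fault(100.0, 0.26))
```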

Relevance:

20.00%

Publisher:

Abstract:

Event-triggered sampling (ETS) is a new approach towards efficient signal analysis. The goal of ETS need not be only signal reconstruction, but also direct estimation of desired information in the signal by skillful design of the event. We show the promise of the ETS approach for better analysis of oscillatory non-stationary signals modeled by a time-varying sinusoid, when compared to existing uniform Nyquist-rate-sampling-based signal processing. We examine samples drawn using ETS, with events defined as zero-crossings (ZC), level-crossings (LC), and extrema, under additive in-band noise and jitter in the detection instant. We find that extrema samples are robust, and also facilitate instantaneous amplitude (IA) and instantaneous frequency (IF) estimation in a time-varying sinusoid. The estimation is proposed using solely extrema samples and a local polynomial regression based least-squares fitting approach. The proposed approach shows improvement, for noisy signals, over the widely used analytic signal, energy separation, and ZC based approaches (which are based on uniform Nyquist-rate sampling based data acquisition and processing). Further, extrema-based ETS in general gives a sub-sampled representation (relative to the Nyquist rate) of a time-varying sinusoid. For the same data-set size captured with extrema-based ETS and uniform sampling, the former gives much better IA and IF estimation. (C) 2015 Elsevier B.V. All rights reserved.
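
A hedged sketch of extrema-based processing is given below (Python/NumPy): extrema detection, a local polynomial least-squares fit of the extrema magnitudes for IA, and a half-period rule for IF. It follows the spirit of the approach described above, not its exact estimator.

```python
import numpy as np

def extrema_samples(x, t):
    """Event-triggered 'sampling': keep only the local extrema of x(t)."""
    idx = np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]) |
                   (x[1:-1] < x[:-2]) & (x[1:-1] < x[2:]))[0] + 1
    return t[idx], x[idx]

def ia_if_from_extrema(te, xe, win=5, deg=2):
    """Crude IA/IF estimates from extrema samples of a time-varying sinusoid.

    IA: local polynomial least-squares fit to the extrema magnitudes |xe|.
    IF: half-period rule, f ~ 1 / (2 * spacing of consecutive extrema).
    """
    ia = np.empty(len(te))
    half = win // 2
    for k in range(len(te)):
        lo, hi = max(0, k - half), min(len(te), k + half + 1)
        coef = np.polyfit(te[lo:hi], np.abs(xe[lo:hi]), deg)
        ia[k] = np.polyval(coef, te[k])
    inst_f = 1.0 / (2.0 * np.diff(te))   # one value per gap between extrema
    return ia, inst_f

# toy usage: a chirp-like time-varying sinusoid
t = np.linspace(0.0, 1.0, 4000)
x = (1 + 0.5 * t) * np.sin(2 * np.pi * (20 * t + 15 * t ** 2))
te, xe = extrema_samples(x, t)
ia, inst_f = ia_if_from_extrema(te, xe)
```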