947 results for "Test method"


Relevance:

30.00%

Publisher:

Abstract:

Determination of reliable solute transport parameters is essential for characterizing the mechanisms and processes involved in solute transport (e.g., pesticides, fertilizers, contaminants) through the unsaturated zone. A rapid, inexpensive method to estimate the dispersivity parameter at the field scale is presented herein. It is based on the quantification of total bromine in soil by the solid-state X-ray fluorescence technique, combined with an inverse numerical modeling approach. The results show that this methodology is a good alternative to the classic Br− determination in soil water by ion chromatography. Good agreement between the observed and simulated total soil Br is reported. The results highlight the potential of the two combined techniques to readily infer solute transport parameters under field conditions.
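The inverse step can be sketched with a grid search: pick the dispersivity that best reproduces observed concentrations under an assumed one-dimensional advection-dispersion solution. The Ogata-Banks-type closed form, the velocity, and the synthetic observations below are illustrative assumptions; the paper itself uses inverse numerical modeling, not this analytical solution.

```python
import math

def ade_conc(x, t, v, alpha, c0=1.0):
    # 1-D advection-dispersion (leading Ogata-Banks term) for a
    # continuous source; dispersion coefficient D = dispersivity * velocity.
    D = alpha * v
    return 0.5 * c0 * math.erfc((x - v * t) / (2.0 * math.sqrt(D * t)))

def fit_dispersivity(obs, t, v, candidates):
    # Inverse estimation by grid search: minimise the sum of squared
    # residuals between observed and simulated concentrations.
    return min(candidates, key=lambda a: sum(
        (c - ade_conc(x, t, v, a)) ** 2 for x, c in obs))

# Synthetic "observations" generated with a known dispersivity of 0.05 m.
v, t = 0.3, 10.0
obs = [(x, ade_conc(x, t, v, 0.05)) for x in (1.0, 2.0, 3.0, 4.0)]
candidates = [k / 100 for k in range(1, 21)]
best = fit_dispersivity(obs, t, v, candidates)  # → 0.05
```

In the paper's setting the forward model would be a numerical solver and the observations total soil Br profiles, but the fitting logic is the same.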


The use of 3D data in mobile robotics applications provides valuable information about the robot's environment. However, such large volumes of 3D information are often difficult to manage because the robot's storage and computing capabilities are insufficient. A data compression method is therefore needed to store and process this information while preserving as much of it as possible. A few methods have been proposed to compress 3D information; nevertheless, no consistent public benchmark exists for comparing the results (compression level, reconstruction distance error, etc.) obtained with different methods. In this paper, we propose a dataset composed of a set of 3D point clouds with different structure and texture variability to evaluate the results obtained from 3D data compression methods. We also provide useful tools for comparing compression methods, using as a baseline the results obtained by existing relevant compression methods.
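For comparing compression methods, a simple distance-based reconstruction error can be computed as the symmetric mean nearest-neighbour distance between the original and the decompressed cloud. This is an illustrative stand-in, not necessarily the exact metric the benchmark tools report:

```python
import math

def nn_dist(p, cloud):
    # Euclidean distance from point p to its nearest neighbour in cloud.
    return min(math.dist(p, q) for q in cloud)

def reconstruction_error(original, reconstructed):
    # Symmetric mean nearest-neighbour distance between the two clouds.
    fwd = sum(nn_dist(p, reconstructed) for p in original) / len(original)
    bwd = sum(nn_dist(q, original) for q in reconstructed) / len(reconstructed)
    return 0.5 * (fwd + bwd)

cloud = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shifted = [(0.0, 0.0, 0.1), (1.0, 0.0, 0.1)]
print(reconstruction_error(cloud, cloud))  # → 0.0
```

The symmetric form matters: averaging both directions penalises a decompressor that drops points as well as one that invents them.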


This case study evaluates the implementation of a secondary land use plan in Winnipeg, MB. The area selected for this case study is the Northeast Neighbourhood in Waverley West; the development of this neighbourhood was guided by the Northeast Neighbourhood Area Structure Plan (NNASP). The study evaluates the implementation of the NNASP through a conformance analysis that answers the following research questions: 1) Does the developed land use pattern in the NNASP area conform to what was planned? 2) Does the implementation of the NNASP conform to the goals, objectives, policies, and intent of the plan? The implementation of the NNASP was evaluated against 62 evaluation criteria generated from the policies of the NNASP. Using this method, the development of the Northeast Neighbourhood is effectively evaluated against the requirements of the NNASP. The conformity test utilized a threefold approach: GIS analysis, a site visit, and document analysis.


Many multifactorial biologic effects, particularly in the context of complex human diseases, are still poorly understood. At the same time, the systematic acquisition of multivariate data has become increasingly easy. The use of such data to analyze and model complex phenotypes, however, remains a challenge. Here, a new analytic approach, termed coreferentiality, is described, together with an appropriate statistical test. Coreferentiality is the indirect relation of two variables of functional interest with respect to whether they parallel each other in their respective relatedness to multivariate reference data, which can be informative for a complex effect or phenotype. It is shown that the power of coreferentiality testing is comparable to that of multiple regression analysis, sufficient even when reference data are informative only to a relatively small extent of 2.5%, and clearly exceeds the power of simple bivariate correlation testing. Thus, coreferentiality testing harnesses the increased power of multivariate analysis while addressing a more straightforwardly interpretable bivariate relatedness. Systematic application of this approach could substantially improve the analysis and modeling of complex phenotypes, particularly in the context of human studies, where addressing functional hypotheses by direct experimentation is often difficult.
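A minimal sketch of the idea as I read it (not the paper's exact statistic): correlate each variable of interest with every column of the reference data, then correlate the two resulting profiles, and assess significance by permutation. All data below are synthetic.

```python
import math
import random

def pearson(a, b):
    # Sample Pearson correlation of two equal-length sequences.
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / math.sqrt(var_a * var_b)

def coreferentiality_test(x, y, ref_cols, n_perm=199, seed=0):
    # Correlate the correlation *profiles* of x and y across the
    # reference columns; two-sided p-value by permuting x.
    prof_y = [pearson(y, c) for c in ref_cols]
    obs = pearson([pearson(x, c) for c in ref_cols], prof_y)
    rng = random.Random(seed)
    xs, hits = list(x), 0
    for _ in range(n_perm):
        rng.shuffle(xs)
        if abs(pearson([pearson(xs, c) for c in ref_cols], prof_y)) >= abs(obs):
            hits += 1
    return obs, (hits + 1) / (n_perm + 1)

# Synthetic demo: y tracks x, reference columns are noisy echoes of x.
rng = random.Random(1)
x = [rng.gauss(0, 1) for _ in range(30)]
y = [xi + rng.gauss(0, 0.1) for xi in x]
ref = [[xi * w + rng.gauss(0, 1) for xi in x]
       for w in (0.2, 0.5, 0.8, 1.0, 1.2, 1.5, 0.3, 0.7)]
r, p = coreferentiality_test(x, y, ref)
```

Because x and y relate to the reference data in parallel, their profiles correlate strongly and the permutation p-value is small.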


This article considers the evaluation of the "viability" of innovation projects. Hidden Markov Models are used as the evaluation method, and the problem of determining the model parameters that reproduce test data with the highest accuracy is solved. The model is trained on statistical data from the implementation of innovation projects, with the Baum-Welch algorithm used for training.
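As a sketch of the machinery involved: the forward algorithm below computes the likelihood of an observation sequence under a discrete HMM. Baum-Welch wraps forward (and backward) passes in an expectation-maximisation loop that re-estimates the transition and emission matrices until the training-data likelihood stops improving; the tiny two-state model here is illustrative only.

```python
def forward(obs, pi, A, B):
    # alpha[j] = P(observations so far, current state = j);
    # pi = initial probs, A[i][j] = transition, B[i][k] = emission.
    n = len(pi)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(n)) * B[j][o]
                 for j in range(n)]
    return sum(alpha)  # total likelihood of the sequence

# Degenerate two-state model: state 0 always emits symbol 0, state 1 symbol 1.
pi = [1.0, 0.0]
A = [[1.0, 0.0], [0.0, 1.0]]
B = [[1.0, 0.0], [0.0, 1.0]]
print(forward([0, 0, 0], pi, A, B))  # → 1.0
```

In a viability application the symbols would be discretised project indicators, and Baum-Welch would fit pi, A, and B to histories of completed projects.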


Two different slug test field methods are conducted in wells completed in a Puget Lowland aquifer and are examined for systematic error resulting from the water column displacement technique. Slug tests using the standard slug rod and the pneumatic method were repeated on the same wells, and hydraulic conductivity estimates were calculated according to Bouwer & Rice and Hvorslev before applying a non-parametric statistical test. Practical considerations of performing the tests in real-life settings are also considered in the method comparison. Statistical analysis indicates that the slug rod method yields up to 90% larger hydraulic conductivity values than the pneumatic method, with at least 95% certainty that the error is method related. This confirms the existence of a slug-rod bias in a real-world setting, which has previously been demonstrated by others in synthetic aquifers. In addition to more accurate values, the pneumatic method requires less field labor and less decontamination, and allows control over the magnitude of the initial displacement, making it the superior slug test procedure.
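Of the two analysis methods named, Hvorslev's estimate is compact enough to sketch: for a partially penetrating well with L/R > 8, K = r² ln(L/R) / (2 L t37), where t37 is the time for the head to recover to 37% of the initial displacement. Parameter names and the example values are mine:

```python
import math

def hvorslev_k(r_casing, R_screen, L_screen, t37):
    # Hvorslev slug-test estimate (partially penetrating well, L/R > 8):
    # K = r^2 * ln(L/R) / (2 * L * t37).
    return (r_casing ** 2 * math.log(L_screen / R_screen)) / (2.0 * L_screen * t37)

k = hvorslev_k(0.05, 0.05, 1.0, 600.0)  # radii/lengths in m, t37 in s
```

Doubling t37 halves K, so any method-related bias in the recorded head recovery propagates directly into the conductivity estimate — which is exactly what the field comparison above probes.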


Thesis (Ph.D.)--University of Washington, 2016-03


We present an efficient and robust method for the calculation of all S matrix elements (elastic, inelastic, and reactive) over an arbitrary energy range from a single real-symmetric Lanczos recursion. Our new method transforms the fundamental equations associated with Light's artificial boundary inhomogeneity approach [J. Chem. Phys. 102, 3262 (1995)] from the primary representation (the original grid or basis representation of the Hamiltonian or its function) into a single tridiagonal Lanczos representation, thereby affording an iterative version of the original algorithm with greatly superior scaling properties. The method has important advantages over existing iterative quantum dynamical scattering methods: (a) the numerically intensive matrix propagation proceeds with real symmetric algebra, which is inherently more stable than its complex symmetric counterpart; (b) no complex absorbing potential or real damping operator is required, saving much of the exterior grid space commonly needed to support these operators and removing the associated parameter dependence. Test calculations are presented for the collinear H + H2 reaction, revealing excellent performance characteristics. (C) 2004 American Institute of Physics.
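The central reduction — a real-symmetric Lanczos recursion that tridiagonalises the Hamiltonian using only matrix-vector products — can be sketched as follows. The scattering-specific boundary-inhomogeneity machinery is omitted, and the 2 × 2 "Hamiltonian" is a toy:

```python
import math

def lanczos(matvec, v0, m):
    # Build the diagonal (alphas) and off-diagonal (betas) of the
    # tridiagonal Lanczos representation from m applications of matvec.
    n = len(v0)
    norm = math.sqrt(sum(x * x for x in v0))
    q = [x / norm for x in v0]
    q_prev = [0.0] * n
    alphas, betas, beta = [], [], 0.0
    for _ in range(m):
        w = matvec(q)
        a = sum(wi * qi for wi, qi in zip(w, q))
        w = [wi - a * qi - beta * pi for wi, qi, pi in zip(w, q, q_prev)]
        alphas.append(a)
        beta = math.sqrt(sum(x * x for x in w))
        betas.append(beta)
        if beta == 0.0:        # exact invariant-subspace breakdown
            break
        q_prev, q = q, [x / beta for x in w]
    return alphas, betas[:-1]  # drop the trailing truncation residual

# Toy Hamiltonian diag(1, 2): the recursion yields alphas = [1.5, 1.5],
# betas = [0.5], whose tridiagonal matrix has eigenvalues 1 and 2.
alphas, betas = lanczos(lambda v: [1.0 * v[0], 2.0 * v[1]], [1.0, 1.0], 2)
```

All arithmetic is real and symmetric, which is the stability point (a) above; the scattering information is then extracted from this tridiagonal representation.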


Background and Purpose. A new method of dynamometry has been developed to measure the performance of the craniocervical (CC) flexor muscles by recording the torque that these muscles exert on the cranium around the CC junction. This report describes the method, the specifications of the instrument, and preliminary reliability data. Subjects and Methods. For the reliability study, 20 subjects (12 with a history of neck pain, 8 without) performed, on 2 occasions, maximal voluntary isometric contraction (MVIC) tests of CC flexion in 3 positions within the range of CC flexion, and submaximal sustained tests (20% and 50% of MVIC) in the middle range of CC flexion (craniocervical neutral position). Reliability coefficients were calculated to establish the test-retest reliability of the measurements. Results. The method demonstrated good reliability over 2 sessions in the measurement of MVIC (intraclass correlation coefficient [ICC] = .79-.93, SEM = 0.6-1.4 N·m) and in the measurement of steadiness (standard deviation of torque amplitude) of a sustained contraction at 20% of MVIC (ICC = .74-.80, SEM = 0.01 N·m), but not at 50% of MVIC (ICC = .07-.76, SEM = 0.04-0.13 N·m). Discussion and Conclusion. The new dynamometry method appears to have potential clinical application in the measurement of craniocervical flexor muscle performance.
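The reported reliability coefficients are intraclass correlations. One common test-retest form, ICC(2,1) — two-way random effects, absolute agreement, single measures — can be computed from a subjects × sessions table as below; whether the paper used this exact form is an assumption on my part.

```python
def icc_2_1(data):
    # ICC(2,1), Shrout & Fleiss: data[i][j] = subject i, session j.
    n, k = len(data), len(data[0])
    grand = sum(sum(row) for row in data) / (n * k)
    row_m = [sum(row) / k for row in data]
    col_m = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_m) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_m) / (k - 1)   # sessions
    mse = sum((data[i][j] - row_m[i] - col_m[j] + grand) ** 2
              for i in range(n) for j in range(k)) / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

perfect = [[1.0, 1.0], [2.0, 2.0], [3.0, 3.0]]
offset = [[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]  # constant +1 shift on retest
print(icc_2_1(perfect))  # → 1.0
```

Note the absolute-agreement property: a constant retest shift (e.g. a practice effect) lowers ICC(2,1) even though subject rankings are preserved.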


The performances of the gelatin particle agglutination test (GPAT) and enzyme-linked immunosorbent assay (ELISA) for the diagnosis of strongyloidiasis with reference to the results of the agar plate culture technique (APCT) were evaluated with samples from 459 individuals from communities in northeast Thailand where strongyloidiasis is endemic. The prevalence of strongyloidiasis in five sample groups determined by GPAT varied between 29.3 and 61.5% (mean, 38.8%). ELISA and APCT, employed concurrently, gave lower prevalence rates of 27.5% (range, 21.6 to 42.1%) and 22.7% (range, 12.7 to 53.8%), respectively. By using APCT as the standard method, the sensitivity of GPAT was generally higher than that of ELISA (81 versus 73%). The specificity of GPAT was slightly lower than that of ELISA (74 versus 86%). The resulting GPAT titers exhibited positive linear relationships with the ELISA values (optical density at 490 nm) (P < 0.05), which suggests that the GPAT titer also reflects the levels of specific antibody comparable to those reflected by the ELISA values. Based on the relative ease and simplicity of use of the technique as well as the acceptable rates of sensitivity and specificity of the test, GPAT is more practical for screening for strongyloidiasis than the conventional ELISA.
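The headline accuracy figures come straight from the 2 × 2 table against the reference standard. The cell counts below are chosen to reproduce the reported GPAT percentages (81% sensitivity, 74% specificity) and are illustrative, not the paper's actual counts:

```python
def sens_spec(tp, fp, fn, tn):
    # Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP),
    # with the reference standard (here APCT) defining true status.
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = sens_spec(tp=81, fp=26, fn=19, tn=74)  # → (0.81, 0.74)
```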


In this paper we apply a new method for the determination of the surface area of carbonaceous materials, using local surface excess isotherms obtained from Grand Canonical Monte Carlo (GCMC) simulation and a concept of area distribution in terms of the energy well-depth of the solid–fluid interaction. The range of well-depths considered in our GCMC simulations is 10 to 100 K, wide enough to cover all carbon surfaces that we dealt with (for comparison, the well-depth for a perfect graphite surface is about 58 K). Given the set of local surface excess isotherms and the differential area distribution, the overall adsorption isotherm can be written in integral form. Thus, given experimental data for nitrogen or argon adsorption on a carbon material, the differential area distribution can be obtained by inversion, using the regularization method, and the total surface area is then the area under this distribution. We test this approach against a number of data sets in the literature and compare our GCMC surface area with that obtained from the classical BET method. In general, we find that the two surface areas differ by about 10%, underlining the need for a very consistent method of reliably determining surface area. We therefore suggest the approach of this paper as an alternative to the BET method, given the long-recognized unrealistic assumptions of BET theory. Besides the surface area, the method also provides the differential area distribution versus well-depth, which can serve as a microscopic fingerprint of the carbon surface; samples prepared from different precursors and different activation conditions are expected to have distinct fingerprints. We illustrate this with Cabot BP120, 280, and 460 samples: the differential area distributions obtained from argon adsorption at 77 K and from nitrogen adsorption at 77 K show exactly the same patterns, suggesting that the distribution is characteristic of the carbon.
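The inversion step can be sketched by discretising the integral form into N_obs(p) ≈ Σ_k f_k n_k(p) and solving a Tikhonov-regularised least-squares problem for the bin areas f_k. The two toy "local isotherms" below are invented for illustration; a real application would use GCMC-computed local isotherms on a fine well-depth grid:

```python
def tikhonov_fit(local_isotherms, observed, lam):
    # Minimise ||A f - b||^2 + lam ||f||^2 via the normal equations
    # (A^T A + lam I) f = A^T b, where A[p][k] = local_isotherms[k][p].
    m, npts = len(local_isotherms), len(observed)
    M = [[sum(local_isotherms[i][p] * local_isotherms[j][p] for p in range(npts))
          + (lam if i == j else 0.0) for j in range(m)] for i in range(m)]
    rhs = [sum(local_isotherms[i][p] * observed[p] for p in range(npts))
           for i in range(m)]
    # Gaussian elimination with partial pivoting.
    for c in range(m):
        piv = max(range(c, m), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        rhs[c], rhs[piv] = rhs[piv], rhs[c]
        for r in range(c + 1, m):
            fac = M[r][c] / M[c][c]
            for cc in range(c, m):
                M[r][cc] -= fac * M[c][cc]
            rhs[r] -= fac * rhs[c]
    f = [0.0] * m
    for r in range(m - 1, -1, -1):
        f[r] = (rhs[r] - sum(M[r][cc] * f[cc] for cc in range(r + 1, m))) / M[r][r]
    return f

local = [[1.0, 2.0, 3.0], [1.0, 1.0, 1.0]]  # n_k at three pressure points
observed = [3.0, 5.0, 7.0]                  # = 2*local[0] + 1*local[1]
f = tikhonov_fit(local, observed, lam=1e-9) # ≈ [2.0, 1.0]
area = sum(f)                               # total surface area ≈ 3.0
```

In practice the regularisation parameter is what keeps the ill-posed inversion stable; it is chosen by the regularisation method rather than set to a token value as here.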


The bispectrum and the third-order moment can be viewed as equivalent tools for testing for the presence of nonlinearity in stationary time series, because the bispectrum is the Fourier transform of the third-order moment. An advantage of the bispectrum is that its estimator comprises terms that are asymptotically independent at distinct bifrequencies under the null hypothesis of linearity. An advantage of the third-order moment is that its values in any subset of joint lags can be used in the test, whereas using the bispectrum requires the entire (or truncated) third-order moment to construct the Fourier transform. In this paper, we propose a test for nonlinearity based upon the estimated third-order moment. We use the phase-scrambling bootstrap method to give a nonparametric estimate of the variance of our test statistic under the null hypothesis. Using a simulation study, we demonstrate that the test attains its target significance level, with large power, when compared with an existing standard parametric test that uses the bispectrum. Further, we show how the proposed test can be used to identify the source of nonlinearity due to interactions at specific frequencies, and we investigate implications for the heuristic diagnosis of nonstationarity.
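The building block of the proposed statistic is the sample third-order moment at a pair of lags; a minimal estimator is sketched below. The paper's exact normalisation and lag windowing, and the phase-scrambling variance estimate, are omitted:

```python
def third_moment(x, j, k):
    # C3(j, k) = mean of (x_t - m)(x_{t+j} - m)(x_{t+k} - m);
    # a Gaussian linear series has C3 = 0 at all lags, so nonzero
    # values signal nonlinearity (or non-Gaussianity).
    n = len(x)
    m = sum(x) / n
    xc = [v - m for v in x]
    top = max(j, k)
    return sum(xc[t] * xc[t + j] * xc[t + k] for t in range(n - top)) / (n - top)

print(third_moment([0.0, 0.0, 3.0], 0, 0))  # → 2.0 (right-skewed sample)
```

A test over a subset of lags evaluates these quantities jointly, with the null variance estimated from phase-scrambled surrogates that preserve the spectrum but destroy phase coupling.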


Subsequent to the influential paper of [Chan, K.C., Karolyi, G.A., Longstaff, F.A., Sanders, A.B., 1992. An empirical comparison of alternative models of the short-term interest rate. Journal of Finance 47, 1209-1227], the generalised method of moments (GMM) has been a popular technique for estimation and inference relating to continuous-time models of the short-term interest rate. GMM has been widely employed to estimate model parameters and to assess the goodness-of-fit of competing short-rate specifications. The current paper conducts a series of simulation experiments to document the bias and precision of GMM estimates of short-rate parameters, as well as the size and power of the J-test of over-identifying restrictions [Hansen, L.P., 1982. Large sample properties of generalised method of moments estimators. Econometrica 50, 1029-1054]. While the J-test appears to have appropriate size and good power in sample sizes commonly encountered in the short-rate literature, GMM estimates of the speed of mean reversion are shown to be severely biased. Consequently, it is dangerous to draw strong conclusions about the strength of mean reversion using GMM. In contrast, the parameter capturing the levels effect, which is important in differentiating between competing short-rate specifications, is estimated with little bias. (c) 2006 Elsevier B.V. All rights reserved.
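For concreteness, the moment conditions typically used when estimating the discretised CKLS model dr = (α + βr)dt + σ r^γ dW by GMM can be sketched as follows. Parameter names and the identity first-stage weighting are illustrative choices, not the paper's exact setup:

```python
def ckls_moments(r, params, dt=1.0):
    # eps_t = r[t+1] - r[t] - (a + b*r[t])*dt; moment conditions
    # E[eps] = E[eps*r] = E[eps^2 - s^2 * r^(2g) * dt]
    #        = E[(eps^2 - s^2 * r^(2g) * dt) * r] = 0.
    a, b, s, g = params
    n = len(r) - 1
    g1 = g2 = g3 = g4 = 0.0
    for t in range(n):
        eps = r[t + 1] - r[t] - (a + b * r[t]) * dt
        v = eps * eps - s * s * r[t] ** (2 * g) * dt
        g1 += eps; g2 += eps * r[t]; g3 += v; g4 += v * r[t]
    return [g1 / n, g2 / n, g3 / n, g4 / n]

def gmm_objective(r, params, dt=1.0):
    # First-stage GMM criterion with identity weighting: g'g.
    return sum(m * m for m in ckls_moments(r, params, dt))

# A noise-free path generated by the drift alone: at the true
# parameters (with sigma = 0) every moment condition vanishes.
a, b = 0.02, -0.5
r = [0.05]
for _ in range(5):
    r.append(r[-1] + (a + b * r[-1]) * 1.0)
obj = gmm_objective(r, (a, b, 0.0, 0.5))  # ≈ 0
```

The simulation evidence above concerns the sampling behaviour of the minimiser of this kind of criterion: β (mean reversion) is poorly pinned down while γ (the levels effect) is not.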


This paper proposes a transmission and wheeling pricing method based on the monetary flow tracing along power flow paths: the monetary flow-monetary path method. Active and reactive power flows are converted into monetary flows by using nodal prices. The method introduces a uniform measurement for transmission service usages by active and reactive powers. Because monetary flows are related to the nodal prices, the impacts of generators and loads on operation constraints and the interactive impacts between active and reactive powers can be considered. Total transmission service cost is separated into more practical line-related costs and system-wide cost, and can be flexibly distributed between generators and loads. The method is able to reconcile transmission service cost fairly and to optimize transmission system operation and development. The case study on the IEEE 30 bus test system shows that the proposed pricing method is effective in creating economic signals towards the efficient use and operation of the transmission system. (c) 2005 Elsevier B.V. All rights reserved.
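The conversion at the heart of the method is nodal price × power flow per branch. The sketch below uses the sending-end price, which is my simplifying assumption; the paper goes further and traces these monetary flows along complete flow paths, for reactive as well as active power:

```python
def monetary_flows(branch_flows, nodal_price):
    # Convert branch active-power flows (MW) into monetary flows ($/h)
    # using the sending node's nodal price ($/MWh).
    return {(i, j): nodal_price[i] * p for (i, j), p in branch_flows.items()}

# Hypothetical 3-bus example (not the paper's IEEE 30-bus case).
prices = {1: 20.0, 2: 22.0, 3: 25.0}
flows = {(1, 2): 50.0, (1, 3): 30.0, (2, 3): 10.0}
money = monetary_flows(flows, prices)
# → {(1, 2): 1000.0, (1, 3): 600.0, (2, 3): 220.0}
```

Because nodal prices embed congestion and loss effects, the resulting monetary flows give a uniform measure of how much each branch's usage is worth, which the method then allocates between generators and loads.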


Objective: This paper compares four techniques used to assess change in neuropsychological test scores before and after coronary artery bypass graft surgery (CABG), and includes a rationale for classifying a patient as overall impaired. Methods: A total of 55 patients were tested before and after surgery on the MicroCog neuropsychological test battery. A matched control group underwent the same testing regime to generate test–retest reliabilities and practice effects. Two techniques designed to assess statistical change were used: the Reliable Change Index (RCI), modified for practice, and the Standardised Regression-based (SRB) technique. These were compared against two fixed-cutoff techniques (the standard deviation and 20% change methods). Results: The incidence of decline across test scores varied markedly depending on which technique was used to describe change. The SRB method identified more patients as declined on most measures. In comparison, the two fixed-cutoff techniques displayed relatively reduced sensitivity in the detection of change. Conclusions: Overall change in an individual can be described provided the investigators choose a rational cutoff based on the likely spread of scores due to chance. A cutoff of ≥20% of the test scores used provided an acceptable probability given the number of tests commonly encountered. Investigators must also choose a test battery that minimises shared variance among test scores.
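The practice-modified Reliable Change Index mentioned above has a compact form: RCI = ((x2 − x1) − practice) / SEdiff, with SEdiff = √2·SEM and SEM = SD·√(1 − r). Argument names are mine and the values below are invented:

```python
import math

def rci_practice(x1, x2, sd_baseline, r_xx, practice):
    # SEM from baseline SD and test-retest reliability; SEdiff = sqrt(2)*SEM.
    sem = sd_baseline * math.sqrt(1.0 - r_xx)
    se_diff = math.sqrt(2.0) * sem
    return ((x2 - x1) - practice) / se_diff

# Score fell 50 -> 45 while controls improved by 5 points on retest:
rci = rci_practice(50.0, 45.0, sd_baseline=10.0, r_xx=0.5, practice=5.0)  # ≈ -1.0
```

|RCI| > 1.645 is a common 90% two-tailed criterion for reliable change; counting the proportion of a battery's scores that decline reliably is what feeds the overall-impairment cutoff discussed above.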