943 results for Weighted average power tests


Relevance: 40.00%

Abstract:

The aim of this study was to verify the correlation between the Wingate arm crank test outputs (peak power, mean power, and fatigue index), obtained on a specific ergometer, and performance in crawl stroke swim sprints of 14, 25, 50, and 400 m. The experiment was conducted with 9 healthy male volunteers (18.1 ± 2.2 years of age; 1.72 ± 0.04 m; 67.7 ± 5.92 kg; and 15.7 ± 4.57% body fat). On separate days, all individuals were submitted to the Wingate arm crank test and to crawl freestyle sprints of 14, 25, 50, and 400 m, which were timed with a stopwatch. The peak power, mean power, and fatigue index obtained during the Wingate arm crank test were not significantly correlated with the maximum swim velocities in the crawl freestyle tests of 14 (r = 0.40; r = 0.64; r = 0.11), 25 (r = 0.28; r = 0.39; r = -0.17), 50 (r = 0.03; r = 0.09; r = -0.31), and 400 (r = -0.52; r = -0.37; r = -0.65) m, respectively. Thus, it is possible to conclude that the Wingate arm crank test is not suitable for assessing the anaerobic power of swimmers under the described experimental conditions.
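
A correlation analysis of this kind can be sketched in a few lines; the Wingate outputs and swim velocities below are hypothetical placeholders, not the study's data.

```python
# Minimal sketch of a Pearson correlation between one Wingate output and one
# swim velocity; the numbers are illustrative placeholders, not the study's data.
import numpy as np
from scipy.stats import pearsonr

peak_power = np.array([420, 455, 390, 510, 470, 430, 445, 400, 465], dtype=float)  # W
v_25m = np.array([1.65, 1.72, 1.58, 1.80, 1.70, 1.62, 1.68, 1.60, 1.74])           # m/s

r, p_value = pearsonr(peak_power, v_25m)
print(f"Pearson r = {r:.2f}, p = {p_value:.3f}")
```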

Relevance: 40.00%

Abstract:

This paper proposes a filter, based on a general regression neural network and a moving average filter, for preprocessing half-hourly load data for the short-term multinodal load forecasting discussed in another paper. Tests with half-hourly load data from nine New Zealand electrical substations demonstrate that this filter is able to handle noise, missing data, and abnormal data.
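
A minimal sketch of the two ingredients named above, assuming a centered moving average and a GRNN in its Nadaraya-Watson (Gaussian kernel regression) form applied to synthetic half-hourly data; this is not the authors' implementation.

```python
# Sketch of the two filter ingredients named in the abstract: a centered moving
# average and a general regression neural network (Gaussian kernel regression).
# Synthetic half-hourly load data; not the authors' implementation.
import numpy as np

def moving_average(x, window=5):
    """Centered moving average; edges handled by shrinking the window."""
    half = window // 2
    return np.array([x[max(0, i - half):i + half + 1].mean() for i in range(len(x))])

def grnn_predict(x_train, y_train, x_query, sigma=1.0):
    """GRNN prediction: Gaussian-kernel-weighted average of training targets."""
    w = np.exp(-((x_query[:, None] - x_train[None, :]) ** 2) / (2.0 * sigma ** 2))
    return (w @ y_train) / w.sum(axis=1)

rng = np.random.default_rng(0)
t = np.arange(48 * 7)                                   # one week of half-hourly steps
load = 100 + 20 * np.sin(2 * np.pi * t / 48) + rng.normal(0, 5, t.size)
load[100] = 500                                         # an abnormal spike
smoothed = moving_average(load, window=5)               # suppress noise/outliers
filtered = grnn_predict(t.astype(float), smoothed, t.astype(float), sigma=2.0)
```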

Relevance: 40.00%

Abstract:

Prototypes of resistive-type superconducting fault current limiters (RSFCL) using YBCO-coated conductors have demonstrated current limitation for medium-voltage-class applications for acting times of up to 80 ms. By connecting an air-core reactor in parallel with the RSFCL, thus forming a hybrid current limiter, the acting time can be extended to up to 1 s. In this work, we report the performance of a hybrid current limiter subjected to an AC peak fault current of 2 kA for 1 s, in which the SFCL limits the current concurrently with the air-core reactor during the first 80 ms, while only the air-core reactor limits the current for the remaining 920 ms. In order to evaluate the actual conditions for subsequent reconnection of the RSFCL to the power grid, the hybrid fault current limiter was tested with recovery intervals varying between 900 ms and 1.2 s, followed again by concurrent operation of the hybrid limiter for 1 s (the SFCL for 80 ms). From this evaluation test, the recovery time can be measured by comparing the voltage peak generated in the superconducting module during the first and second fault tests. The recovery time was also determined through the pulsed current method (PCM) on a short-length sample. The results showed that the fault current was limited from 1.9 kA down to 514 A after one cycle at 60 Hz, with a recovery time lower than 1.2 s for two subsequent fault current tests.

Relevance: 40.00%

Abstract:

This paper examines the local power of the likelihood ratio, Wald, score, and gradient tests in the presence of a scalar parameter, phi say, that is orthogonal to the remaining parameters. We show that some of the coefficients that define the local powers remain unchanged regardless of whether phi is known or needs to be estimated, whereas the others can be written as the sum of two terms: the first is the corresponding term obtained as if phi were known, and the second is an additional term arising from the fact that phi is unknown. The contribution of each set of parameters to the local powers of the tests can then be examined. Various implications of our main result are stated and discussed, and several examples are presented for illustrative purposes.
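
In the usual notation, with the log-likelihood \ell evaluated at the unrestricted and restricted maximum likelihood estimators \hat\theta and \tilde\theta, U_1 the block of the score vector corresponding to the tested parameter \theta_1, and K^{11} the corresponding block of the inverse expected information matrix, the four statistics for testing H_0: \theta_1 = \theta_1^{(0)} take the generic forms below (a sketch of the standard definitions, not the paper's expansions).

\[
\begin{aligned}
S_{\mathrm{LR}} &= 2\{\ell(\hat\theta) - \ell(\tilde\theta)\},\\
S_{\mathrm{W}}  &= (\hat\theta_1 - \theta_1^{(0)})^{\top}\{K^{11}(\hat\theta)\}^{-1}(\hat\theta_1 - \theta_1^{(0)}),\\
S_{\mathrm{R}}  &= U_1(\tilde\theta)^{\top} K^{11}(\tilde\theta)\, U_1(\tilde\theta),\\
S_{\mathrm{T}}  &= U_1(\tilde\theta)^{\top}(\hat\theta_1 - \theta_1^{(0)}).
\end{aligned}
\]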

Relevance: 40.00%

Abstract:

In this paper we obtain asymptotic expansions, up to order n^(-1/2) and under a sequence of Pitman alternatives, for the nonnull distribution functions of the likelihood ratio, Wald, score, and gradient test statistics in the class of symmetric linear regression models. This is a wide class of models that encompasses the t model and several other symmetric distributions with longer-than-normal tails. The asymptotic distributions of all four statistics are obtained for testing a subset of regression parameters. Furthermore, in order to compare the finite-sample performance of these tests in this class of models, Monte Carlo simulations are presented. An empirical application to a real data set is considered for illustrative purposes.
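
Expansions of this kind are typically expressed through noncentral chi-square distribution functions; a generic form (not the paper's specific coefficients) is

\[
\Pr(S \le x) \;=\; G_{q,\lambda}(x) \;+\; n^{-1/2}\sum_{k=0}^{3} b_k\, G_{q+2k,\lambda}(x) \;+\; o\!\left(n^{-1/2}\right),
\]

where G_{m,\lambda} denotes the distribution function of a noncentral chi-square variate with m degrees of freedom and noncentrality parameter \lambda, q is the number of restrictions under test, and the coefficients b_k, which differ across the four statistics, drive the local power comparison.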

Relevance: 40.00%

Abstract:

We derive asymptotic expansions for the nonnull distribution functions of the likelihood ratio, Wald, score, and gradient test statistics in the class of dispersion models, under a sequence of Pitman alternatives. The asymptotic distributions of these statistics are obtained for testing a subset of regression parameters and for testing the precision parameter. Based on these nonnull asymptotic expansions, the powers of all four tests, which are equivalent to first order, are compared. Furthermore, in order to compare the finite-sample performance of these tests in this class of models, Monte Carlo simulations are presented. An empirical application to a real data set is considered for illustrative purposes.
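
Finite-sample size and power comparisons of this kind follow a common Monte Carlo recipe; the sketch below illustrates the recipe for a simple z-test of a normal mean, not for the dispersion-model statistics studied in the paper.

```python
# Generic Monte Carlo power estimation: simulate under a fixed alternative,
# compute a test statistic, and record the rejection rate at a nominal level.
import numpy as np
from scipy.stats import norm

def monte_carlo_power(mu_alt, n=30, level=0.05, n_rep=10_000, seed=1):
    rng = np.random.default_rng(seed)
    crit = norm.ppf(1 - level / 2)
    rejections = 0
    for _ in range(n_rep):
        x = rng.normal(mu_alt, 1.0, size=n)   # data under the alternative
        z = np.sqrt(n) * x.mean()             # test of H0: mu = 0, sigma known
        rejections += abs(z) > crit
    return rejections / n_rep

print(monte_carlo_power(0.0))   # close to the nominal 5% size under H0
print(monte_carlo_power(0.5))   # estimated power against mu = 0.5
```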

Relevance: 40.00%

Abstract:

Several MCAO systems are under study to improve the angular resolution of the current and future generations of large ground-based telescopes (diameters in the 8-40 m range). The subject of this PhD thesis is embedded in this context. Two MCAO systems, at different stages of realization, are addressed in this thesis: NIRVANA, the 'double' MCAO system designed for one of the interferometric instruments of LBT, is in the integration and testing phase; MAORY, the future E-ELT MCAO module, is under preliminary study. These two systems tackle the sky coverage problem in two different ways.

The layer-oriented approach of NIRVANA, coupled with multi-pyramid wavefront sensors, takes advantage of the optical co-addition of the signal coming from up to 12 NGS in an annular 2' to 6' technical FoV and up to 8 in the central 2' FoV. Summing the light coming from many natural sources makes it possible to increase the limiting magnitude of the single NGS and to improve the sky coverage considerably. One of the two wavefront sensors for the mid- and high-altitude atmosphere analysis has been integrated and tested as a stand-alone unit in the laboratory at INAF-Osservatorio Astronomico di Bologna and afterwards delivered to the MPIA laboratories in Heidelberg, where it was integrated and aligned to the post-focal optical relay of one LINC-NIRVANA arm. A number of tests were performed in order to characterize and optimize the system functionalities and performance. A report on this work is presented in Chapter 2.

In the MAORY case, to ensure correction uniformity and sky coverage, the LGS-based approach is the current baseline. However, since the sodium layer is approximately 10 km thick, the artificial reference source looks elongated, especially when observed from the edge of a large aperture. On a 30-40 m class telescope, for instance, the maximum elongation varies between a few arcsec and 10 arcsec, depending on the actual telescope diameter, the sodium layer properties, and the laser launcher position. The centroiding error in a Shack-Hartmann WFS increases proportionally to the elongation (in a photon-noise-dominated regime), strongly limiting the performance. A straightforward way to compensate for this effect is to increase the laser power, i.e. to increase the number of detected photons per subaperture. The scope of Chapter 3 is twofold: an analysis of the performance of three different algorithms (Weighted Center of Gravity, Correlation, and Quad-cell) for the instantaneous LGS image position measurement in the presence of elongated spots, and the determination of the number of photons required to achieve a certain average wavefront error over the telescope aperture. An alternative optical solution to the spot elongation problem is proposed in Section 3.4.

Starting from the considerations presented in Chapter 3, a first-order analysis of the LGS WFS for MAORY (number of subapertures, number of detected photons per subaperture, RON, focal plane sampling, subaperture FoV) is the subject of Chapter 4. An LGS WFS laboratory prototype was designed to reproduce the relevant aspects of an LGS SH WFS for the E-ELT and to evaluate the performance of different centroid algorithms in the presence of elongated spots, as investigated numerically and analytically in Chapter 3. This prototype permits the simulation of realistic sodium profiles. A full testing plan for the prototype is set out in Chapter 4.
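
As an illustration of the first of the three centroid algorithms mentioned above, the sketch below applies a weighted center of gravity to a synthetic elongated spot; the Gaussian weighting map and spot parameters are assumptions made for the example, not the thesis configuration.

```python
# Weighted center of gravity (WCoG) on a synthetic elongated spot.
# The Gaussian weighting map and noise level are illustrative assumptions.
import numpy as np

def weighted_cog(image, weights):
    """Return the (x, y) centroid of `image` under the pixel weighting `weights`."""
    w_img = image * weights
    total = w_img.sum()
    ys, xs = np.indices(image.shape)
    return (w_img * xs).sum() / total, (w_img * ys).sum() / total

n = 32
ys, xs = np.indices((n, n))
x0, y0 = 16.3, 15.7
# Spot elongated along y (sigma_y > sigma_x), as for a perspective-elongated LGS image.
spot = np.exp(-((xs - x0) ** 2 / (2 * 1.5 ** 2) + (ys - y0) ** 2 / (2 * 5.0 ** 2)))
spot += np.random.default_rng(2).normal(0, 0.01, spot.shape)            # noise proxy
weights = np.exp(-((xs - n / 2) ** 2 + (ys - n / 2) ** 2) / (2 * 6.0 ** 2))  # weighting map
print(weighted_cog(spot, weights))
```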

Relevance: 40.00%

Abstract:

There are numerous statistical methods for quantitative trait linkage analysis in human studies. An ideal method would have high power to detect genetic loci contributing to the trait, would be robust to non-normality in the phenotype distribution, would be appropriate for general pedigrees, would allow the incorporation of environmental covariates, and would be appropriate in the presence of selective sampling. We recently described a general framework for quantitative trait linkage analysis, based on generalized estimating equations, for which many current methods are special cases. This procedure is appropriate for general pedigrees and easily accommodates environmental covariates. In this paper, we use computer simulations to investigate the power and robustness of a variety of linkage test statistics built upon our general framework. We also propose two novel test statistics that take account of higher moments of the phenotype distribution in order to accommodate non-normality. These new linkage tests are shown to have high power and to be robust to non-normality. Although we have not yet examined the performance of our procedures in the context of selective sampling via computer simulations, the proposed tests satisfy all of the other qualities of an ideal quantitative trait linkage analysis method.
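
In generic GEE notation (a sketch, not necessarily the authors' exact specification), the regression parameters \beta solve

\[
U(\beta) \;=\; \sum_{i=1}^{N} D_i^{\top} V_i^{-1}\bigl(y_i - \mu_i(\beta)\bigr) \;=\; 0,
\]

where, for family i, y_i is the vector of phenotypes, \mu_i(\beta) the mean model (which can include environmental covariates), D_i = \partial\mu_i/\partial\beta, and V_i a working covariance matrix; in linkage applications the marker identity-by-descent information typically enters through the working covariance, and test statistics are then built from components of the estimating function and its covariance.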

Relevance: 40.00%

Abstract:

We evaluated the muscular strength, endurance, and power responses of 12 college students, ranging in age from 19 to 40 years, who participated in a 6-wk high-intensity training program commonly used to improve muscular endurance. Muscular strength was measured by a one repetition maximum (1RM) bench press test and a 1RM Hammer bench press test; muscular endurance was measured by administering a 70-percent 1RM test to failure on the Hammer bench press; and upper body power was measured by administering a medicine ball throw test. We observed a 4.8-percent improvement of 2.7 kg on the bench press, a 14.6-percent improvement of 10.5 kg on the Hammer bench press, a 45.5-percent improvement with an average increase of five repetitions on the submaximal test to failure, and an average improvement of approximately 20 percent (60 cm) for the medicine ball throw. For our subjects, a commonly used high-intensity muscular endurance training program resulted in improved performance on tests measuring muscular strength, endurance, and power, with zero injuries reported during training or assessment procedures.

Relevance: 40.00%

Abstract:

Sizes and powers of selected two-sample tests of the equality of survival distributions are compared by simulation for small samples from unequally, randomly censored exponential distributions. The tests investigated include parametric tests (F, Score, Likelihood, Asymptotic), logrank tests (Mantel, Peto-Peto), and Wilcoxon-type tests (Gehan, Prentice). Equal-sized samples, n = 8, 16, 32, with 1000 (size) and 500 (power) simulation trials, are compared for 16 combinations of the censoring proportions 0%, 20%, 40%, and 60%. For n = 8 and 16, the Asymptotic, Peto-Peto, and Wilcoxon tests perform at nominal 5% size expectations, but the F, Score, and Mantel tests exceeded the 5% size confidence limits for 1/3 of the censoring combinations. For n = 32, all tests showed proper size, with the Peto-Peto test most conservative in the presence of unequal censoring. Powers of all tests are compared for exponential hazard ratios of 1.4 and 2.0. There is little difference in the power characteristics of the tests within the classes of tests considered. The Mantel test showed 90% to 95% power efficiency relative to the parametric tests. Wilcoxon-type tests have the lowest relative power but are robust to differential censoring patterns. A modified Peto-Peto test shows power comparable to the Mantel test. For n = 32, a specific Weibull-exponential comparison of crossing survival curves suggests that the relative powers of logrank and Wilcoxon-type tests depend on the scale parameter of the Weibull distribution. Wilcoxon-type tests appear more powerful than logrank tests in the case of late-crossing survival curves and less powerful for early-crossing curves. Guidelines for the appropriate selection of two-sample tests are given.
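
One simulation trial of this kind can be sketched as follows: generate randomly censored exponential samples and compute a Mantel (logrank) statistic. The parameter values below are illustrative, not the dissertation's exact design.

```python
# One trial: randomly censored exponential samples and a Mantel (logrank) statistic.
import numpy as np

def simulate_censored_exponential(n, hazard, censor_rate, rng):
    t = rng.exponential(1.0 / hazard, n)           # event times
    c = rng.exponential(1.0 / censor_rate, n)      # random censoring times
    return np.minimum(t, c), (t <= c).astype(int)  # observed time, event indicator

def logrank_statistic(time1, event1, time2, event2):
    time = np.concatenate([time1, time2])
    event = np.concatenate([event1, event2])
    group = np.concatenate([np.zeros(len(time1)), np.ones(len(time2))])
    obs_minus_exp, var = 0.0, 0.0
    for t in np.unique(time[event == 1]):          # distinct event times
        at_risk = time >= t
        n_j, n1_j = at_risk.sum(), (at_risk & (group == 0)).sum()
        d_j = ((time == t) & (event == 1)).sum()
        d1_j = ((time == t) & (event == 1) & (group == 0)).sum()
        obs_minus_exp += d1_j - d_j * n1_j / n_j
        if n_j > 1:
            var += d_j * (n_j - d_j) * n1_j * (n_j - n1_j) / (n_j ** 2 * (n_j - 1))
    return obs_minus_exp / np.sqrt(var)            # approximately N(0, 1) under H0

rng = np.random.default_rng(0)
t1, e1 = simulate_censored_exponential(16, 1.0, 0.25, rng)   # ~20% censoring
t2, e2 = simulate_censored_exponential(16, 2.0, 0.25, rng)   # hazard ratio 2.0
print(logrank_statistic(t1, e1, t2, e2))
```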

Relevance: 40.00%

Abstract:

Territory or zone design processes entail partitioning a geographic space, organized as a set of areal units, into different regions or zones according to a specific set of criteria that are dependent on the application context. In most cases, the aim is to create zones of approximately equal sizes (zones with equal numbers of inhabitants, same average sales, etc.). However, some of the new applications that have emerged, particularly in the context of sustainable development policies, are aimed at defining zones of a predetermined, though not necessarily similar, size. In addition, the zones should be built around a given set of seeds. This type of partitioning has not been sufficiently researched; therefore, there are no known approaches for automated zone delimitation. This study proposes a new method based on a discrete version of the adaptive additively weighted Voronoi diagram that makes it possible to partition a two-dimensional space into zones of specific sizes, taking both the position and the weight of each seed into account. The method consists of repeatedly solving a traditional additively weighted Voronoi diagram, so that each seed's weight is updated at every iteration. The zones are geographically connected using a metric based on the shortest path. Tests conducted on the extensive farming system of three municipalities in Castile-La Mancha (Spain) have established that the proposed heuristic procedure is valid for solving this type of partitioning problem. Nevertheless, these tests confirmed that the given seed position determines the spatial configuration the method must solve, and this may have a great impact on the resulting partition.
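
A much simplified sketch of the adaptive, additively weighted Voronoi idea is given below: grid cells are assigned to the seed minimizing distance minus weight, and each weight is then nudged toward its target zone size. Euclidean distance replaces the shortest-path metric used in the paper, and the proportional weight update is an assumption made for the illustration.

```python
# Simplified adaptive additively weighted Voronoi partition on a regular grid.
import numpy as np

def adaptive_weighted_voronoi(grid_shape, seeds, targets, n_iter=200, step=0.5):
    ys, xs = np.indices(grid_shape)
    pts = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
    seeds = np.asarray(seeds, dtype=float)
    weights = np.zeros(len(seeds))
    for _ in range(n_iter):
        dist = np.linalg.norm(pts[:, None, :] - seeds[None, :, :], axis=2)
        labels = np.argmin(dist - weights[None, :], axis=1)   # additively weighted assignment
        sizes = np.bincount(labels, minlength=len(seeds))
        weights += step * (targets - sizes) / targets         # grow/shrink zones toward targets
    return labels.reshape(grid_shape)

seeds = [(10, 10), (40, 15), (25, 40)]
targets = np.array([1000, 800, 700])    # desired cell counts; the 50x50 grid has 2500 cells
zones = adaptive_weighted_voronoi((50, 50), seeds, targets)
print(np.bincount(zones.ravel()))       # resulting zone sizes
```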

Relevance: 40.00%

Abstract:

This study has three main objectives. First, it develops a generalization of the commonly used EKS method for multilateral price comparisons. It is shown that the EKS system can be generalized so that weights can be attached to each of the link comparisons used in the EKS computations. These weights can account for differing levels of reliability of the underlying binary comparisons. Second, various reliability measures and corresponding weighting schemes are presented and their merits discussed. Finally, these new methods are applied to an international data set of manufacturing prices from the ICOP project. Although the weighted EKS method is theoretically superior, its empirical impact appears to be generally small compared to that of the unweighted EKS, and it is larger when the method is applied at lower levels of aggregation. The study also indicates the importance of using sector-specific PPPs in assessing relative levels of manufacturing productivity.
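
In standard notation, the EKS parity between countries j and k is the geometric mean of all chained binary parities, and a weighted generalization of the kind described above attaches link-specific weights; this is a sketch of one common way to write it, not necessarily the paper's exact formulation.

\[
P_{jk}^{\mathrm{EKS}} \;=\; \prod_{l=1}^{M}\bigl(P_{jl}\,P_{lk}\bigr)^{1/M},
\qquad
P_{jk}^{\mathrm{WEKS}} \;=\; \prod_{l=1}^{M}\bigl(P_{jl}\,P_{lk}\bigr)^{w_{jlk}},
\quad \sum_{l=1}^{M} w_{jlk} = 1,
\]

where P_{jl} denotes the binary (e.g., Fisher) parity between countries j and l, M is the number of countries, and the weights w_{jlk} can reflect the reliability of the underlying binary comparisons.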