60 results for reliability algorithms

at University of Queensland eSpace - Australia


Relevance:

30.00%

Publisher:

Abstract:

We tested the effects of four data characteristics on the results of reserve selection algorithms. The data characteristics were nestedness of features (land types in this case), rarity of features, size variation of sites (potential reserves) and size of data sets (numbers of sites and features). We manipulated data sets to produce three levels, with replication, of each of these data characteristics while holding the other three characteristics constant. We then used an optimizing algorithm and three heuristic algorithms to select sites to solve several reservation problems. We measured efficiency as the number or total area of selected sites, indicating the relative cost of a reserve system. Higher nestedness increased the efficiency of all algorithms (reduced the total cost of new reserves). Higher rarity reduced the efficiency of all algorithms (increased the total cost of new reserves). More variation in site size increased the efficiency of all algorithms expressed in terms of total area of selected sites. We measured the suboptimality of heuristic algorithms as the percentage increase of their results over optimal (minimum possible) results. Suboptimality is a measure of the reliability of heuristics as indicative costing analyses. Higher rarity reduced the suboptimality of heuristics (increased their reliability) and there is some evidence that more size variation did the same for the total area of selected sites. We discuss the implications of these results for the use of reserve selection algorithms as indicative and real-world planning tools.
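The paper's optimizing and heuristic selection algorithms are not reproduced in the abstract; as an illustrative sketch only, the following greedy set-cover heuristic captures the flavour of cost-aware reserve selection (the site names, costs, and land-type features are invented):

```python
# Illustrative greedy set-cover heuristic for reserve selection.
# Repeatedly pick the site that covers the most still-unrepresented
# features per unit cost, until all target features are represented.

def greedy_reserve(sites, targets):
    """sites: {name: (cost, feature_set)}; targets: features to represent."""
    selected, covered = [], set()
    remaining = dict(sites)
    while covered < targets and remaining:
        # score = newly represented features per unit cost
        best = max(remaining,
                   key=lambda s: len(remaining[s][1] - covered) / remaining[s][0])
        cost, feats = remaining.pop(best)
        if not feats - covered:
            break  # no remaining site adds anything new
        selected.append(best)
        covered |= feats
    return selected, covered

sites = {"A": (2.0, {"wetland", "forest"}),
         "B": (1.0, {"forest"}),
         "C": (3.0, {"wetland", "grassland"})}
sel, cov = greedy_reserve(sites, {"wetland", "forest", "grassland"})
```

Efficiency in the abstract's sense would be the number or total cost of `sel`; suboptimality is how far that cost sits above the optimum found by an exact solver.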

Relevance:

20.00%

Publisher:

Abstract:

This study determined the inter-tester and intra-tester reliability of physiotherapists measuring functional motor ability of traumatic brain injury clients using the Clinical Outcomes Variable Scale (COVS). To test inter-tester reliability, 14 physiotherapists scored the ability of 16 videotaped patients to execute the items that comprise the COVS. Intra-tester reliability was determined by four physiotherapists repeating their assessments after one week, and three months later. The intra-class correlation coefficients (ICC) were very high for both inter-tester reliability (ICC > 0.97 for total COVS scores, ICC > 0.93 for individual COVS items) and intra-tester reliability (ICC > 0.97). This study demonstrates that physiotherapists are reliable in the administration of the COVS.
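The abstract does not state which ICC formulation was used; a minimal sketch of one common variant, ICC(3,1) for consistency derived from a two-way ANOVA decomposition, might look like this (the example ratings are invented):

```python
import numpy as np

# Sketch of ICC(3,1): two-way mixed model, consistency definition.
# Rows are subjects, columns are raters.

def icc_consistency(ratings):
    r = np.asarray(ratings, dtype=float)
    n, k = r.shape
    grand = r.mean()
    ss_rows = k * ((r.mean(axis=1) - grand) ** 2).sum()    # between subjects
    ss_cols = n * ((r.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((r - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# two raters who agree up to a constant offset -> perfect consistency
icc = icc_consistency([[1, 2], [2, 3], [3, 4]])
```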

Relevance:

20.00%

Publisher:

Abstract:

The aim of this study was to examine the reliability and validity of field tests for assessing physical function in mid-aged and young-old people (55–70 y). Tests were selected that required minimal space and equipment and could be implemented in multiple field settings such as a general practitioner's office. Nineteen participants completed two field testing sessions and one laboratory testing session. Intra-class correlations showed good reliability for the tests of upper body strength (lift and reach, R= .66), lower body strength (sit to stand, R= .80) and functional capacity (Canadian Step Test, R= .92), but not for leg power (single timed chair rise, R= .28). There was also good reliability for the balance test during 3 stances: parallel (94.7% agreement), semi-tandem (73.7%), and tandem (52.6%). Comparison of field test results with objective laboratory measures found good validity for the sit to stand (cf 1RM leg press, Pearson r= .68, p< .05), and for the step test (cf PWC140, r= −.60, p< .001), but not for the lift and reach (cf 1RM bench press, r= .43, p> .05), balance (r= −.13, −.18, .23) and rate of force development tests (r= −.28). It was concluded that the lower body strength and cardiovascular function tests were appropriate for use in field settings with mid-aged and young-old adults.

Relevance:

20.00%

Publisher:

Abstract:

Market-based transmission expansion planning gives investors information on where investment is most cost-efficient and brings benefits to those who invest in the grid. However, both market issues and power system adequacy problems are system planners' concerns. In this paper, a hybrid probabilistic criterion, Expected Economical Loss (EEL), is proposed as an index to evaluate a system's overall expected economical losses during operation in a competitive market. It reflects both the investors' and the planner's points of view and further improves on the traditional reliability cost. By applying EEL, system planners can obtain a clear idea of the transmission network's bottleneck and the amount of loss arising from this weak point. In turn, this enables planners to assess the worth of providing reliable services. The EEL also contains valuable information for investors undertaking their investments. This index truly reflects the random behaviour of power systems and the uncertainties of the electricity market. The performance of the EEL index is enhanced by applying a Normalized Coefficient of Probability (NCP), so that it can be utilized in large real power systems. A numerical example is carried out on the IEEE Reliability Test System (RTS), showing how the EEL can predict the current system bottleneck under future operational conditions and how to use the EEL as one of the planning objectives to determine future optimal plans. Monte Carlo simulation is employed to capture the probabilistic characteristics of the electricity market, and Genetic Algorithms (GAs) are used as a multi-objective optimization tool.
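The paper's actual EEL definition and its NCP enhancement are not given in the abstract; the following is only a toy Monte Carlo sketch of an EEL-style index, under invented capacities, outage rates, and load/price distributions, showing the general shape of such a simulation:

```python
import random

# Toy Monte Carlo sketch of an expected-economic-loss index:
# sample system states (line outages, demand, spot price) and
# accumulate the cost of energy not served in each state.

def expected_economic_loss(n_samples=20000, seed=1):
    rng = random.Random(seed)
    line_capacity, outage_prob, n_lines = 100.0, 0.05, 3  # invented system
    total = 0.0
    for _ in range(n_samples):
        # which transmission lines survive in this sampled state
        capacity = sum(line_capacity for _ in range(n_lines)
                       if rng.random() > outage_prob)
        load = rng.gauss(220.0, 30.0)          # uncertain demand (MW)
        price = rng.lognormvariate(3.5, 0.4)   # uncertain spot price ($/MWh)
        shortfall = max(0.0, load - capacity)  # energy not served
        total += shortfall * price             # economic loss in this state
    return total / n_samples

eel = expected_economic_loss()
```

A planning loop in the spirit of the paper would evaluate this index for each candidate expansion plan (e.g. as one objective inside a genetic algorithm) and prefer plans that reduce it.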

Relevance:

20.00%

Publisher:

Abstract:

A large number of models have been derived from the two-parameter Weibull distribution and are referred to as Weibull models. They exhibit a wide range of shapes for the density and hazard functions, which makes them suitable for modelling complex failure data sets. The WPP and IWPP plots allow one to determine in a systematic manner whether one or more of these models are suitable for modelling a given data set. This paper deals with this topic.
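The Weibull probability plot (WPP) transform itself is standard: for two-parameter Weibull data, plotting ln(-ln(1 - F(t))) against ln(t) yields a straight line whose slope is the shape parameter, and departures from linearity point toward one of the derived models. A minimal sketch (the plotting position and sample data are illustrative):

```python
import math
import random

# WPP sketch: transform ordered failure times so that two-parameter
# Weibull data fall on a straight line; the fitted slope estimates
# the shape parameter.

def wpp_points(data):
    xs = sorted(data)
    n = len(xs)
    pts = []
    for i, t in enumerate(xs, start=1):
        f = (i - 0.3) / (n + 0.4)  # median-rank plotting position
        pts.append((math.log(t), math.log(-math.log(1.0 - f))))
    return pts

def fitted_slope(pts):
    # least-squares slope, used here as a shape-parameter estimate
    n = len(pts)
    mx = sum(x for x, _ in pts) / n
    my = sum(y for _, y in pts) / n
    num = sum((x - mx) * (y - my) for x, y in pts)
    den = sum((x - mx) ** 2 for x, _ in pts)
    return num / den

rng = random.Random(0)
sample = [rng.weibullvariate(2.0, 1.5) for _ in range(500)]  # scale 2, shape 1.5
beta_hat = fitted_slope(wpp_points(sample))  # should be near 1.5
```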

Relevance:

20.00%

Publisher:

Abstract:

Despite many successes of conventional DNA sequencing methods, some DNAs remain difficult or impossible to sequence. Unsequenceable regions occur in the genomes of many biologically important organisms, including the human genome. Such regions range in length from tens to millions of bases, and may contain valuable information such as the sequences of important genes. The authors have recently developed a technique that renders a wide range of problematic DNAs amenable to sequencing. The technique is known as sequence analysis via mutagenesis (SAM). This paper presents a number of algorithms for analysing and interpreting data generated by this technique.

Relevance:

20.00%

Publisher:

Abstract:

The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices and has never before been applied to eigenanalysis for power system small signal stability. This paper analyzes the differences between the BR and QR algorithms, comparing their performance in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm uses accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in eigenanalysis tasks on 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is the more efficient algorithm for large-scale power system small signal stability eigenanalysis.
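The BR algorithm is not available in standard numerical libraries, so as a point of reference the sketch below shows a bare-bones version of the QR iteration it is compared against, applied to a small symmetric tridiagonal (hence upper Hessenberg) matrix. Production eigensolvers, like the BR algorithm itself, add shifts, deflation, and banded storage:

```python
import numpy as np

def qr_iteration(H, steps=200):
    """Unshifted QR iteration: factor A = QR, form RQ, repeat."""
    A = np.array(H, dtype=float)
    for _ in range(steps):
        Q, R = np.linalg.qr(A)
        A = R @ Q  # similarity transform: eigenvalues are preserved
    return np.sort(np.diag(A))  # diagonal holds eigenvalues at convergence

H = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
eigs = qr_iteration(H)  # eigenvalues: 3 - sqrt(3), 3, 3 + sqrt(3)
```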

Relevance:

20.00%

Publisher:

Abstract:

Algorithms for explicit integration of structural dynamics problems with multiple time steps (subcycling) are investigated. Only one such algorithm, due to Smolinski and Sleith, has proved to be stable in a classical sense. A simplified version of this algorithm that retains its stability is presented. However, as with the original version, it can be shown to sacrifice accuracy to achieve stability. Another algorithm in use is shown to be only statistically stable, in that a probability of stability can be assigned if appropriate time step limits are observed. This probability improves rapidly with the number of degrees of freedom in a finite element model. The stability problems are shown to be a property of the central difference method itself, which is modified to give the subcycling algorithm. A related problem is shown to arise when a constraint equation in time is introduced into a time-continuous space-time finite element model. (C) 1998 Elsevier Science S.A.
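The conditional stability attributed to the central difference method can be seen already on a single undamped oscillator u'' = -ω²u, where the scheme is stable only for ω·dt ≤ 2; the step sizes below are chosen purely to illustrate one stable and one unstable run, not taken from the paper:

```python
import math

# Central-difference integration of u'' = -omega**2 * u.
# Conditionally stable: requires omega * dt <= 2.

def central_difference(omega, dt, steps, u0=1.0, v0=0.0):
    # start-up value u(-dt) from a Taylor expansion
    u_prev = u0 - v0 * dt + 0.5 * (-(omega ** 2) * u0) * dt * dt
    u = u0
    for _ in range(steps):
        a = -(omega ** 2) * u                   # acceleration at current step
        u_next = 2.0 * u - u_prev + a * dt * dt # central-difference update
        u_prev, u = u, u_next
    return u

omega = 2.0 * math.pi
stable = central_difference(omega, dt=0.25 / omega, steps=4000)  # omega*dt = 0.25
unstable = central_difference(omega, dt=2.5 / omega, steps=200)  # omega*dt = 2.5
```

The stable run stays bounded near its initial amplitude; the unstable one blows up geometrically, which is the behaviour a subcycling scheme inherits from its parent method.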

Relevance:

20.00%

Publisher:

Abstract:

The present investigation assessed the reliability and validity of the scores of a subjective measure of desired aspirations and a behavioral measure of enacted aspirations. A sample of 5,655 employees was randomly split into two halves. Principal components analysis on Sample 1, followed by confirmatory factor analysis on Sample 2, confirmed the desired and enacted scales as distinct but related measures of managerial aspirations. The desired and enacted scales had satisfactory levels of internal consistency and temporal stability over a 1-year period. Relationships between the measures of desired and enacted managerial aspirations and both attitudinal and behavioral criteria, measured concurrently and 1 year later, provided preliminary support for convergent and discriminant validity for our sample. Desired aspirations demonstrated stronger validity than enacted aspirations. Although further examination of the psychometric properties of the scales is warranted, the present findings provide promising support for their validity and reliability for our sample.
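The abstract reports internal consistency without naming the statistic's form; a minimal sketch of Cronbach's alpha, one standard internal-consistency coefficient, is shown below (the example data are invented, not the study's):

```python
import numpy as np

# Sketch of Cronbach's alpha for a multi-item scale.
# Rows are respondents, columns are scale items.

def cronbach_alpha(items):
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_vars = x.var(axis=0, ddof=1).sum()  # sum of per-item variances
    total_var = x.sum(axis=1).var(ddof=1)    # variance of the scale total
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

# two perfectly covarying items -> alpha of 1
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```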

Relevance:

20.00%

Publisher:

Abstract:

Extended gcd calculation has a long history and plays an important role in computational number theory and linear algebra. Recent results have shown that finding optimal multipliers in extended gcd calculations is difficult. We present an algorithm which uses lattice basis reduction to produce small integer multipliers x(1), ..., x(m) for the equation s = gcd(s(1), ..., s(m)) = x(1)s(1) + ... + x(m)s(m), where s(1), ..., s(m) are given integers. The method generalises to produce small unimodular transformation matrices for computing the Hermite normal form of an integer matrix.
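The lattice-reduction algorithm itself is not sketched here; for contrast, the classical fold of pairwise extended gcds below produces valid multipliers x(i), but makes no attempt to keep them small, which is precisely the problem the lattice-based approach addresses:

```python
# Classical extended gcd, folded over m integers to obtain multipliers
# satisfying sum(x(i) * s(i)) = gcd(s(1), ..., s(m)).

def egcd(a, b):
    """Return (g, x, y) with g = gcd(a, b) = a*x + b*y."""
    x0, x1, y0, y1 = 1, 0, 0, 1
    while b:
        q, a, b = a // b, b, a % b
        x0, x1 = x1, x0 - q * x1
        y0, y1 = y1, y0 - q * y1
    return a, x0, y0

def multi_egcd(nums):
    """Fold pairwise gcds left to right, rescaling earlier multipliers."""
    g, xs = nums[0], [1]
    for s in nums[1:]:
        g, u, v = egcd(g, s)
        xs = [u * x for x in xs] + [v]
    return g, xs

g, xs = multi_egcd([12, 18, 30])  # g == 6
```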

Relevance:

20.00%

Publisher:

Abstract:

The Las Campanas Observatory and Anglo-Australian Telescope Rich Cluster Survey (LARCS) is a panoramic imaging and spectroscopic survey of an X-ray luminosity-selected sample of 21 clusters of galaxies at 0.07 < z < 0.16. Charge-coupled device (CCD) imaging was obtained in B and R of typically 2 degrees wide regions centred on the 21 clusters, and the galaxy sample selected from the imaging is being used for an on-going spectroscopic survey of the clusters with the 2dF spectrograph on the Anglo-Australian Telescope. This paper presents the reduction of the imaging data and the photometric analysis used in the survey. Based on an overlapping area of 12.3 deg(2), we compare the CCD-based LARCS catalogue with the photographic galaxy catalogue from the APM used as the input to the 2dF Galaxy Redshift Survey (2dFGRS), down to the completeness limit of the GRS/APM catalogue, b(J) = 19.45. This comparison confirms the reliability of the photometry across our mosaics and between the clusters in our survey, and also provides useful information concerning the properties of the GRS/APM. The stellar contamination in the GRS/APM galaxy catalogue is confirmed to be around 5-10 per cent, as originally estimated. However, using the superior sensitivity and spatial resolution of the LARCS survey, evidence is found for four distinct populations of galaxies that are systematically omitted from the GRS/APM catalogue. The characteristics of the 'missing' galaxy populations are described, reasons for their absence are examined, and the impact they will have on the conclusions drawn from the 2dF Galaxy Redshift Survey is discussed.