18 results for RELIABILITY ANALYSIS
in CentAUR: Central Archive University of Reading - UK
Abstract:
Reliability analysis of probabilistic forecasts, in particular through the rank histogram or Talagrand diagram, is revisited. Two shortcomings are pointed out: firstly, a uniform rank histogram is but a necessary condition for reliability; secondly, if the forecast is assumed to be reliable, an indication is needed of how far a histogram is expected to deviate from uniformity merely due to randomness. Concerning the first shortcoming, it is suggested that forecasts be grouped or stratified along suitable criteria, and that reliability be analyzed individually for each forecast stratum. A reliable forecast should have uniform histograms for all individual forecast strata, not only for all forecasts as a whole. As to the second shortcoming, instead of the observed frequencies, the probability of the observed frequency is plotted, providing an indication of the likelihood of the result under the hypothesis that the forecast is reliable. Furthermore, a goodness-of-fit statistic is discussed which is essentially the reliability term of the Ignorance score. The discussed tools are applied to medium-range forecasts for 2 m temperature anomalies at several locations and lead times. The forecasts are stratified along the expected ranked probability score. Those forecasts which feature a high expected score turn out to be particularly unreliable.
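The rank (Talagrand) histogram at the core of this analysis is straightforward to compute. The sketch below is a minimal illustration with synthetic data (the ensemble size and sample count are invented, not from the paper), constructing a forecast that is reliable by design:

```python
import numpy as np

rng = np.random.default_rng(0)
n_forecasts, n_members = 1000, 10

# Synthetic ensemble forecasts and verifying observations drawn from the
# same distribution, i.e. a forecast that is reliable by construction.
ensembles = rng.normal(size=(n_forecasts, n_members))
observations = rng.normal(size=n_forecasts)

# Rank of each observation within its ensemble (0 .. n_members).
ranks = (ensembles < observations[:, None]).sum(axis=1)

# A reliable forecast gives a statistically flat histogram over the
# n_members + 1 possible ranks.
counts = np.bincount(ranks, minlength=n_members + 1)
print(counts)
```

Under reliability each of the n_members + 1 bins is equally likely, so deviations from flatness beyond sampling noise (the paper's second point) signal miscalibration, while flatness alone (the first point) does not prove reliability.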
Abstract:
Purpose – The purpose of this research is to show that reliability analysis and its implementation will lead to an improved whole life performance of building systems, and hence an improvement in their life cycle costs (LCC). Design/methodology/approach – This paper analyses reliability impacts on the whole life cycle of building systems, and reviews the up-to-date approaches adopted in UK construction, based on questionnaires designed to investigate the use of reliability within the industry. Findings – Approaches to reliability design and maintainability design have been introduced at the operating environment level, system structural level and component level, and a scheduled maintenance logic tree is modified based on the model developed by Pride. At different stages of the whole life cycle of building services systems, reliability-associated factors should be considered to ensure the systems' whole life performance. It is suggested that data analysis should be applied in reliability design, maintainability design, and maintenance policy development. Originality/value – The paper presents important factors in different stages of the whole life cycle of the systems, together with reliability and maintainability design approaches which can be helpful for building services system designers. The survey from the questionnaires provides designers with an understanding of the key impacting factors.
Abstract:
This paper investigates the psychometric properties of Vigneron and Johnson's Brand Luxury Index scale. The authors developed the scale using data collected from a student sample in Australia. To validate the scale, the study reported in this paper uses data collected from Taiwanese luxury consumers. The scale was initially subjected to reliability analysis, yielding low α values for two of its five proposed dimensions. Exploratory and confirmatory factor analyses were subsequently performed to examine the dimensionality of brand luxury. Discriminant and convergent validity tests highlight the need for further research into the dimensionality of the construct. Although the scale represents a good initial contribution to understanding brand luxury, in view of consumers' emerging shopping patterns, further investigation is warranted to establish the psychometric properties of the scale and its equivalence across cultures.
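The α values mentioned here refer to Cronbach's alpha, the standard internal-consistency reliability coefficient for multi-item scales. A minimal sketch of the computation (the item scores below are invented for illustration, not the study's data):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_var_sum = items.var(axis=0, ddof=1).sum()   # sum of item variances
    total_var = items.sum(axis=1).var(ddof=1)        # variance of total scores
    return k / (k - 1) * (1 - item_var_sum / total_var)

# Hypothetical 5-item Likert responses from six respondents.
scores = np.array([
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 5, 5, 4, 5],
    [3, 3, 2, 3, 3],
    [4, 4, 4, 5, 4],
    [1, 2, 1, 2, 1],
])
print(round(cronbach_alpha(scores), 3))
```

Values near 1 indicate that the items of a dimension covary strongly; the low α values reported for two dimensions mean their items did not hang together in the Taiwanese sample.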
Abstract:
We consider the linear equality-constrained least squares problem (LSE) of minimizing ${\|c - Gx\|}_2 $, subject to the constraint $Ex = p$. A preconditioned conjugate gradient method is applied to the Kuhn–Tucker equations associated with the LSE problem. We show that our method is well suited for structural optimization problems in reliability analysis and optimal design. Numerical tests are performed on an Alliant FX/8 multiprocessor and a Cray-X-MP using some practical structural analysis data.
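The structure of the LSE problem can be made concrete with a tiny example. The sketch below forms the saddle-point (Kuhn–Tucker) system and solves it with a dense direct solver; the paper applies a preconditioned conjugate gradient method to the same system, but a direct solve keeps the illustration short (all matrices here are invented):

```python
import numpy as np

def solve_lse(G, c, E, p):
    """Solve min ||c - G x||_2 subject to E x = p via the KKT system
    [[G^T G, E^T], [E, 0]] [x; lam] = [G^T c; p]."""
    n, m = G.shape[1], E.shape[0]
    K = np.block([[G.T @ G, E.T],
                  [E, np.zeros((m, m))]])
    rhs = np.concatenate([G.T @ c, p])
    return np.linalg.solve(K, rhs)[:n]   # discard the multipliers lam

# Tiny invented example: fit x in R^2 to four observations,
# constrained so that x1 + x2 = 1.
G = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [1.0, -1.0]])
c = np.array([0.3, 0.6, 1.1, -0.2])
E = np.array([[1.0, 1.0]])
p = np.array([1.0])

x = solve_lse(G, c, E, p)
print(x, E @ x)  # the constraint residual E @ x - p should be ~0
```

In structural reliability and optimal design problems the blocks of this system are large and sparse, which is what motivates an iterative Krylov method with preconditioning instead of the dense factorization used here.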
Abstract:
Prior literature showed that the Felder and Silverman learning styles model (FSLSM) has been widely adopted to cater to the individual styles of learners, whether in traditional or Technology Enhanced Learning (TEL) settings. To identify learning styles according to this model, the Index of Learning Styles (ILS) instrument was proposed. This research aims to analyse the soundness of this instrument for an Arabic sample. Data were integrated from different courses and years. A total of 259 engineering students participated voluntarily in the study. Reliability was analysed by applying internal construct reliability, inter-scale correlation, and total item correlation. Construct validity was also considered by running factor analysis. The overall results indicated that the reliability and validity of the perception and input dimensions were moderately supported, whereas the processing and understanding dimensions showed low internal-construct consistency and their items loaded weakly on the associated constructs. Generally, the instrument needs further work to improve its soundness. However, given the consistency of the results produced by engineering students irrespective of cross-cultural differences, it can be adopted to diagnose learning styles.
Abstract:
Sensitivity, specificity, and reproducibility are vital to interpret neuroscientific results from functional magnetic resonance imaging (fMRI) experiments. Here we examine the scan–rescan reliability of the percent signal change (PSC) and parameters estimated using Dynamic Causal Modeling (DCM) in scans taken in the same scan session, less than 5 min apart. We find fair to good reliability of PSC in regions that are involved with the task, and fair to excellent reliability with DCM. Also, the DCM analysis uncovers group differences that were not present in the analysis of PSC, which implies that DCM may be more sensitive to the nuances of signal changes in fMRI data.
Abstract:
The purpose of this study was to apply and compare two time-domain analysis procedures in the determination of oxygen uptake (VO2) kinetics in response to a pseudorandom binary sequence (PRBS) exercise test. PRBS exercise tests have typically been analysed in the frequency domain. However, the complex interpretation of frequency responses may have limited the application of this procedure in both sporting and clinical contexts, where a single time measurement would facilitate subject comparison. The relative potential of both a mean response time (MRT) and a peak cross-correlation time (PCCT) was investigated. This study was divided into two parts: a test-retest reliability study (part A), in which 10 healthy male subjects completed two identical PRBS exercise tests, and a comparison of the VO2 kinetics of 12 elite endurance runners (ER) and 12 elite sprinters (SR; part B). In part A, 95% limits of agreement were calculated for comparison between MRT and PCCT. The results of part A showed no significant difference between test and retest as assessed by MRT [mean (SD) 42.2 (4.2) s and 43.8 (6.9) s] or by PCCT [21.8 (3.7) s and 22.7 (4.5) s]. Measurement error (%) was lower for MRT in comparison with PCCT (16% and 25%, respectively). In part B of the study, the VO2 kinetics of ER were significantly faster than those of SR, as assessed by MRT [33.4 (3.4) s and 39.9 (7.1) s, respectively; P < 0.01] and PCCT [20.9 (3.8) s and 24.8 (4.5) s; P < 0.05]. It is possible that either analysis procedure could provide a single test measurement of VO2 kinetics; however, the greater reliability of the MRT data suggests that this method has more potential for development in the assessment of VO2 kinetics by PRBS exercise testing.
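The mean response time referred to above can be illustrated for a first-order system with a pure delay, where the MRT equals the delay plus the time constant. The sketch below uses invented kinetic parameters, not the study's data:

```python
import numpy as np

# Synthetic first-order VO2 step response with a time delay (parameters
# invented): y(t) = 1 - exp(-(t - delay)/tau) for t >= delay, else 0.
dt, delay, tau = 0.1, 15.0, 25.0      # all in seconds
t = np.arange(0.0, 300.0, dt)
y = np.where(t >= delay, 1.0 - np.exp(-(t - delay) / tau), 0.0)

# MRT = area between the final value (1.0) and the normalized response;
# for this model it equals delay + tau.
mrt = (1.0 - y).sum() * dt
print(mrt)  # close to delay + tau = 40 s
```

Because the MRT condenses the whole response into one area integral, it is less sensitive to noise at any single lag than a cross-correlation peak, which is consistent with the lower measurement error reported for MRT.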
Abstract:
This paper presents preliminary results from an assessment of the barriers to adaptation to water supply shortage in a case study catchment in south east England with multiple supply companies. The investigation applies a conceptual framework, which distinguishes between generic barriers affecting the ability of supply companies to make adaptation decisions, and specific barriers to the implementation of each option. The preliminary analysis suggests that whilst there is a widespread awareness of the challenge of climate change, and a conceptual understanding of the need for adaptation, some of the generic barriers that will affect detailed evaluations and actual adaptation decisions have yet to be approached. The analysis also shows that different individual adaptation options are assessed differently by different stakeholders, and that there are differences in the barriers to adoption between supply-side and demand-side measures. First, however, the paper develops the general conceptual framework for the characterisation of the barriers to adaptation used in the study.
Abstract:
Two different ways of performing low-energy electron diffraction (LEED) structure determinations for the p(2 x 2) structure of oxygen on Ni {111} are compared: a conventional LEED-IV structure analysis using integer and fractional-order IV-curves collected at normal incidence, and an analysis using only integer-order IV-curves collected at three different angles of incidence. A clear discrimination between different adsorption sites can be achieved by the latter approach as well as by the first, and the best-fit structures of both analyses are within each other's error bars (all less than 0.1 angstrom). The conventional analysis is more sensitive to the adsorbate coordinates and lateral parameters of the substrate atoms, whereas the integer-order-based analysis is more sensitive to the vertical coordinates of substrate atoms. Adsorbate-related contributions to the intensities of integer-order diffraction spots are independent of the state of long-range order in the adsorbate layer. These results show, therefore, that for lattice-gas disordered adsorbate layers, for which only integer-order spots are observed, similar accuracy and reliability can be achieved as for ordered adsorbate layers, provided the data set is large enough.
Abstract:
Sensitive methods that are currently used to monitor proteolysis by plasmin in milk are limited due to their high cost and lack of standardisation for quality assurance in the various dairy laboratories. In this study, four methods, trinitrobenzene sulphonic acid (TNBS), reverse phase high pressure liquid chromatography (RP-HPLC), gel electrophoresis and fluorescamine, were selected to assess their suitability for the detection of proteolysis in milk by plasmin. Commercial UHT milk was incubated with plasmin at 37 °C for one week. Clarification was achieved by isoelectric precipitation (pH 4·6 soluble extracts) or 6% (final concentration) trichloroacetic acid (TCA). The pH 4·6 and 6% TCA soluble extracts of milk showed high correlations (R2 > 0·93) by the TNBS, fluorescamine and RP-HPLC methods, confirming increased proteolysis during storage. For gel electrophoresis, extensive proteolysis was confirmed by the disappearance of α- and β-casein bands on the seventh day, which was more evident at the highest plasmin concentration. This was accompanied by the appearance of α- and β-casein proteolysis products with higher intensities than on previous days, implying that more products had been formed as a result of casein breakdown. The fluorescamine method had a lower detection limit compared with the other methods, whereas gel electrophoresis was the best qualitative method for monitoring β-casein proteolysis products. Although HPLC was the most sensitive, the TNBS method is recommended for use in routine laboratory analysis on the basis of its accuracy, reliability and simplicity.
Abstract:
Modern transaction cost economics (TCE) thinking has developed into a key intellectual foundation of international business (IB) research, but the Williamsonian version has faced substantial criticism for adopting the behavioral assumption of opportunism. In this paper we assess both the opportunism concept and existing alternatives such as trust within the context of IB research, especially work on multinational enterprise (MNE) governance. Case analyses of nine global MNEs illustrate an alternative to the opportunism assumption that captures more fully the mechanisms underlying failed commitments inside the MNE. As a substitute for the often-criticized assumption of opportunism, we propose the envelope concept of bounded reliability (BRel), an assumption that represents more accurately and more completely the reasons for failed commitments, without invalidating the other critical assumption in conventional TCE (and internalization theory) thinking, namely the widely accepted envelope concept of bounded rationality (BRat). Bounded reliability as an envelope concept includes two main components, within the context of global MNE management: opportunism as intentional deceit, and benevolent preference reversal. The implications for IB research of adopting the bounded reliability concept are far reaching, as this concept may increase the legitimacy of comparative institutional analysis in the social sciences.