979 results for reliability testing


Relevance: 30.00%

Abstract:

In essay 1 we develop a new autoregressive conditional process to capture both the changes and the persistence of the intraday seasonal (U-shape) pattern of volatility. Unlike other procedures, this approach allows the intraday volatility pattern to change over time without the filtering process injecting a spurious pattern of noise into the filtered series. We show that prior deterministic filtering procedures are special cases of the autoregressive conditional filtering process presented here. Lagrange multiplier tests show that the stochastic seasonal variance component is statistically significant. Specification tests using the correlogram and cross-spectral analyses confirm the reliability of the autoregressive conditional filtering process. In essay 2 we develop a new methodology to decompose return variance in order to examine the informativeness embedded in the return series. The variance is decomposed into an information arrival component and a noise component. This decomposition differs from previous studies in that both the informational variance and the noise variance are time-varying, and the covariance between the informational and noise components is no longer restricted to be zero. The resulting measure of price informativeness is defined as the informational variance divided by the total variance of the returns. The noisy rational expectations model predicts that uninformed traders react to price changes more than informed traders, since uninformed traders cannot distinguish between price changes caused by information arrivals and those caused by noise. This hypothesis is tested in essay 3 using intraday data with the intraday seasonal volatility component removed, based on the procedure developed in the first essay. The resulting seasonally adjusted variance series is decomposed into components caused by unexpected information arrivals and by noise in order to examine informativeness.
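As a hedged illustration of the price-informativeness measure defined above (informational variance divided by total return variance), the following Python sketch evaluates the ratio on made-up time-varying variance and covariance paths; the decomposition procedure itself, and all numerical values, are placeholders rather than the essay's actual method.

```python
import numpy as np

T = 1_000  # number of intraday return observations (hypothetical)

# Hypothetical time-varying variance paths for the two components.
var_info = 0.5 + 0.2 * np.sin(np.linspace(0, 4 * np.pi, T)) ** 2
var_noise = 0.3 + 0.1 * np.cos(np.linspace(0, 4 * np.pi, T)) ** 2
cov_in = 0.05 * np.ones(T)  # covariance is not restricted to zero

# Total return variance and the informativeness ratio per observation.
var_total = var_info + var_noise + 2.0 * cov_in
informativeness = var_info / var_total

print(f"mean informativeness: {informativeness.mean():.3f}")
```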

Relevance: 30.00%

Abstract:

The goal of this thesis was to develop, construct, and validate the Perceived Economic Burden scale to quantitatively measure the burden associated with a subtype of Arrhythmogenic Right Ventricular Cardiomyopathy (ARVC) in families from the island of Newfoundland. An original 76-item self-administered survey was designed using content from the existing literature as well as themes from qualitative research conducted by our team, and was distributed to individuals from families known to be at risk for the disease. A response rate of 37.2% (n = 64) was achieved between December 2013 and May 2014. Tests for data quality, Likert scale assumptions and scale reliability were conducted and provided preliminary evidence of the psychometric properties of the final perceived economic burden of ARVC scale, which comprises 62 items in five sections. Findings indicated that being an affected male was a significant predictor of increased perceived economic burden on the majority of the economic burden measures. Affected males also reported an increased likelihood of going on disability and difficulty obtaining insurance. Affected females also had an increased perceived financial burden. Preliminary results suggest that a perceived economic burden exists within the ARVC population in Newfoundland.

Relevance: 30.00%

Abstract:

Background: It is important to assess the clinical competence of nursing students to gauge their educational needs. Competence can be measured by self-assessment tools; however, Anema and McCoy (2010) contend that currently available measures should be further psychometrically tested.
Aim: To test the psychometric properties of the Nursing Competencies Questionnaire (NCQ) and the Self-Efficacy in Clinical Performance (SECP) clinical competence scales.
Method: A non-randomly selected sample of n = 248 second-year nursing students completed the NCQ, SECP and demographic questionnaires (June and September 2013). Mokken Scaling Analysis (MSA) was used to investigate structural validity and scale properties; convergent and discriminant validity and reliability were also tested for each scale.
Results: MSA identified the NCQ as a unidimensional scale with a strong scalability coefficient (Hs = 0.581) but limited item rankability (HT = 0.367). For the SECP, MSA suggested that the scale could potentially be split into two unidimensional scales (SECP28 and SECP7), each with good/reasonable scalability as a summed scale but negligible/very limited rankability (SECP28: Hs = 0.55, HT = 0.211; SECP7: Hs = 0.61, HT = 0.049). Analysis of between-cohort differences and NCQ/SECP scores produced evidence of discriminant and convergent validity; good internal reliability was also found: NCQ α = 0.93, SECP28 α = 0.96 and SECP7 α = 0.89.

Discussion: In line with previous research, further evidence of the NCQ's reliability and validity was demonstrated. However, as the SECP findings are new and the sample is small with reference to Straat and colleagues (2014), the SECP results should be interpreted with caution and verified on a second sample.
Conclusions: Measurement of perceived self-competence could start early in a nursing programme to support students’ development of clinical competence. Further testing of the SECP scale with larger nursing student samples from different programme years is indicated.

References:
Anema, M. G. and McCoy, J. K. (2010) Competency-Based Nursing Education: Guide to Achieving Outstanding Learner Outcomes. New York: Springer.
Straat, J. H., van der Ark, L. A. and Sijtsma, K. (2014) Minimum Sample Size Requirements for Mokken Scale Analysis. Educational and Psychological Measurement, 74(5), 809-822.
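As a hedged illustration of the internal-consistency (Cronbach's alpha) figures reported in the Results above, the sketch below computes alpha on simulated Likert-type responses; the data, sample dimensions and scoring are illustrative only, and the Mokken scalability coefficients (Hs, HT) would in practice be obtained with dedicated software such as the R mokken package.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)       # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical Likert-type responses (248 students x 28 items, 1-5 scale).
rng = np.random.default_rng(1)
latent = rng.normal(size=(248, 1))
responses = np.clip(np.round(3 + latent + rng.normal(scale=0.8, size=(248, 28))), 1, 5)

print(f"alpha = {cronbach_alpha(responses):.2f}")
```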

Relevance: 30.00%

Abstract:

With the continued miniaturization and increasing performance of electronic devices, new technical challenges have arisen. One such issue is delamination occurring at critical interfaces inside the device. This major reliability issue can occur during the manufacturing process or during normal use of the device. Proper evaluation of the adhesion strength of critical interfaces early in the product development cycle can help reduce reliability issues and the time-to-market of the product. However, conventional adhesion strength testing is inherently limited in the face of package miniaturization, which brings further technical challenges in quantifying design integrity and reliability. Although there are many different interfaces in today's advanced electronic packages, they can be generalized into two main categories: 1) rigid-to-rigid connections with a thin flexible polymeric layer in between, or 2) a thin film membrane on a rigid structure. Since every technique has its own advantages and disadvantages, multiple testing methods must be enhanced and developed to accommodate all the interfaces encountered in emerging electronic packaging technologies. For evaluating strongly bonded interfaces in thin multilayer structures, a novel adhesion test configuration called the "single cantilever adhesion test" (SCAT) is proposed and implemented for an epoxy molding compound (EMC) and photo solder resist (PSR) interface. The test method is then shown to be capable of comparing two potential EMC/PSR material sets and selecting the stronger one. Additionally, a theoretical approach for establishing the applicable testing domain of a four-point bending test method is presented. For evaluating polymeric films on rigid substrates, the major testing challenges are reducing testing scatter and factoring in the potentially degrading effect of environmental conditioning on the material properties of the film. An advanced blister test with a predefined area, based on an elasto-plastic analytical solution, was developed and implemented for a conformal coating used to prevent tin whisker growth. The method was then extended with a numerical approach for evaluating the adhesion strength when the film properties of the polymer are unknown.

Relevance: 30.00%

Abstract:

This dissertation focuses on design challenges caused by secondary impacts to printed wiring assemblies (PWAs) within hand-held electronics due to accidental drop or impact loading. The continuing increase in functionality, miniaturization and affordability has resulted in a decrease in the size and weight of handheld electronic products. As a result, PWAs have become thinner and the clearances between surrounding structures have decreased. The resulting increase in the flexibility of the PWAs, in combination with the reduced clearances, requires new design rules to minimize and survive possible internal collisions between PWAs and surrounding structures. Such collisions are termed 'secondary impacts' in this study. The effect of secondary impact on the board-level drop reliability of printed wiring boards (PWBs) assembled with MEMS microphone components is investigated using a combination of testing, response and stress analysis, and damage modeling. The response analysis is conducted using a combination of numerical finite element modeling and simplified analytic models for additional parametric sensitivity studies.

Relevance: 30.00%

Abstract:

Thin film adhesion often determines microelectronic device reliability, and it is therefore essential to have experimental techniques that accurately and efficiently characterize it. Laser-induced delamination is a novel technique that uses laser-generated stress waves to load thin films at high strain rates and extract the fracture toughness of the film/substrate interface. The effectiveness of the technique in measuring the interface properties of metallic films has been documented in previous studies. The objective of the current effort is to model the effect of residual stresses on the dynamic delamination of thin films. Residual stresses can be high enough to affect the crack advance and the mode mixity of the delamination event, and must therefore be adequately modeled to make accurate and repeatable predictions of fracture toughness. The equivalent axial force and bending moment generated by the residual stresses are included in a dynamic, nonlinear finite element model of the delaminating film, and the impact of residual stresses on the final extent of the interfacial crack, the relative contribution of shear failure, and the deformed shape of the delaminated film is studied in detail. Another objective of the study is to develop techniques to address issues related to the testing of polymeric films. These types of films adhere well to silicon, and the resulting crack advance is often much smaller than for metallic films, making the extraction of the interface fracture toughness more difficult. The use of an inertial layer, which enhances the amount of kinetic energy trapped in the film and thus the crack advance, is examined. It is determined that the inertial layer does improve the crack advance, although in a relatively limited fashion. The high interface toughness of polymer films often causes the film to fail cohesively when the crack front leaves the weakly bonded region and enters the strong interface. The use of a tapered pre-crack region that provides a more gradual transition to the strong interface is examined. The tapered triangular pre-crack geometry is found to be effective in reducing the induced stresses, thereby making it an attractive option. We conclude by studying the impact of modifying the pre-crack geometry to enable the testing of multiple polymer films.
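As a rough, hedged companion to the discussion of residual stresses, the sketch below evaluates the standard thin-film estimates of the equivalent axial force and bending moment per unit width, together with the steady-state energy release rate available from a uniform residual stress; all material values are placeholders, and the dynamic finite element model described above is not reproduced here.

```python
# Equivalent loading and stored-energy estimate for a uniformly
# stressed thin film (standard thin-film fracture mechanics relations).
sigma_r = 60e6    # residual stress [Pa] (placeholder value)
h = 1.0e-6        # film thickness [m] (placeholder value)
E = 3.0e9         # film Young's modulus [Pa] (placeholder, polymer-like)
nu = 0.35         # Poisson's ratio (placeholder)

N = sigma_r * h            # equivalent axial force per unit width [N/m]
M = N * h / 2.0            # equivalent bending moment per unit width [N]
G_ss = (1 - nu**2) * sigma_r**2 * h / (2 * E)  # steady-state energy release rate [J/m^2]

print(f"N = {N:.3e} N/m, M = {M:.3e} N, G_ss = {G_ss:.3f} J/m^2")
```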

Relevance: 30.00%

Abstract:

There is increasing concern with reducing cost and overhead during the development of reliable systems. Selective protection of the most critical parts of a system represents a viable solution for obtaining a high level of reliability at a fraction of the cost. In particular, to design a selective fault-mitigation strategy for processor-based systems, it is mandatory to identify and prioritize the most vulnerable registers in the register file as the best candidates to be protected (hardened). This paper presents an application-based metric to estimate the criticality of each register in the register file of microprocessor-based systems. The proposed metric relies on the combination of three different criteria based on common features of the executed applications. The applicability and accuracy of our proposal have been evaluated on a set of applications running on different microprocessors. Results show a significant improvement in accuracy compared to previous approaches, regardless of the underlying architecture.
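A heavily hedged sketch of what a register-criticality ranking of this kind might look like is given below; the three criteria used here (access count, live-interval length and number of dependent instructions) and their weights are illustrative assumptions, not the specific metric proposed in the paper.

```python
from dataclasses import dataclass

@dataclass
class RegisterProfile:
    name: str
    accesses: int      # how often the application reads/writes the register
    live_cycles: int   # cycles during which the value is live (exposed to upsets)
    dependents: int    # instructions whose results depend on the register

def criticality(p: RegisterProfile, w=(0.4, 0.4, 0.2)) -> float:
    # Weighted combination of raw criteria (criteria and weights are illustrative).
    return w[0] * p.accesses + w[1] * p.live_cycles + w[2] * p.dependents

profiles = [
    RegisterProfile("r1", accesses=120, live_cycles=900, dependents=35),
    RegisterProfile("r2", accesses=40,  live_cycles=200, dependents=10),
    RegisterProfile("sp", accesses=300, live_cycles=1500, dependents=80),
]

# Rank registers so the most critical ones are hardened first.
for p in sorted(profiles, key=criticality, reverse=True):
    print(p.name, round(criticality(p), 1))
```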

Relevance: 30.00%

Abstract:

The thesis "COMPARATIVE ANALYSIS OF EFFICIENCY AND OPERATING CHARACTERISTICS OF AUTOMOTIVE POWERTRAIN ARCHITECTURES THROUGH CHASSIS DYNAMOMETER TESTING" was completed through a collaborative partnership between Michigan Technological University and Argonne National Laboratory under a contractual agreement titled "Advanced Vehicle Characterization at Argonne National Laboratory". The goal of this project was to investigate, understand and document the performance and operational strategy of several modern passenger vehicles of various architectures. The vehicles were chosen to represent several popular engine and transmission architectures and were instrumented to allow for data collection to facilitate comparative analysis. In order to ensure repeatability and reliability during testing, each vehicle was tested over a series of identical drive cycles in a controlled environment utilizing a vehicle chassis dynamometer. Where possible, instrumentation was preserved between vehicles to ensure robust data collection. The efficiency and fuel economy performance of the vehicles was studied. In addition, the powertrain utilization strategies, significant energy loss sources, tailpipe emissions, combustion characteristics, and cold start behavior were also explored in detail. It was concluded that each vehicle realizes different strengths and suffers from different limitations in the course of their attempts to maximize efficiency and fuel economy. In addition, it was observed that each vehicle regardless of architecture exhibits significant energy losses and difficulties in cold start operation that can be further improved with advancing technology. It is clear that advanced engine technologies and driveline technologies are complimentary aspects of vehicle design that must be utilized together for best efficiency improvements. Finally, it was concluded that advanced technology vehicles do not come without associated cost; the complexity of the powertrains and lifecycle costs must be considered to understand the full impact of advanced vehicle technology.

Relevance: 30.00%

Abstract:

The aim of the present study was to examine the reliability of tethered swimming in the evaluation of age group swimmers. The sample was composed of 8 male national-level swimmers with at least 4 years of experience in competitive swimming. Each swimmer performed two 30-second maximal-intensity tethered swimming tests on separate days. Individual force-time curves were recorded to assess maximum force, mean force and the mean impulse of force. Both consistency and reliability were very strong, with Cronbach's alpha values ranging from 0.970 to 0.995. All the applied metrics presented very high agreement between tests, with the mean impulse of force presenting the highest. These results indicate that tethered swimming can be used to evaluate age group swimmers. Furthermore, a better understanding of the swimmers' ability to effectively exert force in the water can be obtained using the impulse of force. Key words: swimming, training and testing, propulsive force, front crawl.
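A small, hedged sketch of how the force-time metrics mentioned above (maximum force, mean force and the impulse of force, i.e. the time integral of force) can be extracted from a tethered-swimming record; the force trace below is synthetic, not measured data.

```python
import numpy as np

# Hypothetical 30 s force-time record sampled at 100 Hz.
fs = 100
t = np.arange(0, 30, 1 / fs)
force = 80 + 40 * np.abs(np.sin(2 * np.pi * 0.8 * t))  # stroke-like pattern [N]

f_max = force.max()    # maximum force [N]
f_mean = force.mean()  # mean force [N]
# Impulse of force: trapezoidal integration of F over time [N*s].
impulse = np.sum(0.5 * (force[1:] + force[:-1]) * np.diff(t))

print(f"Fmax = {f_max:.1f} N, Fmean = {f_mean:.1f} N, impulse = {impulse:.0f} N*s")
```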

Relevance: 30.00%

Abstract:

In the field of vibration qualification testing, with the popular Random Control mode of shakers, the specimen is excited by random vibrations typically specified in the form of a Power Spectral Density (PSD). The corresponding signals are stationary and Gaussian, i.e. they follow a normal distribution. Conversely, real-life excitations are frequently non-Gaussian, exhibiting high peaks and/or bursts and/or deterministic harmonic components. Kurtosis is a parameter often used to statistically describe the occurrence and significance of high peak values in a random process. Since the similarity between test input profiles and real-life excitations is fundamental for qualification test reliability, methods of kurtosis control can be implemented to synthesize realistic (non-Gaussian) input signals. Durability tests are performed to check the resistance of a component to vibration-based fatigue damage. A procedure to synthesize test excitations that starts from measured data and preserves both the damage potential and the characteristics of the reference signals is therefore desirable. The Fatigue Damage Spectrum (FDS) is generally used to quantify the fatigue damage potential associated with the excitation. The signal synthesized for accelerated durability tests (i.e. with a limited duration) must feature the same FDS as the reference vibration computed for the component's expected lifetime. Current standard procedures are efficient in synthesizing signals in the form of a PSD, but prove inaccurate if the reference data are non-Gaussian. This work presents novel algorithms for the synthesis of accelerated durability test profiles with a prescribed FDS and a non-Gaussian distribution. An experimental campaign is conducted to validate the algorithms by testing their accuracy, robustness and practical effectiveness. Moreover, an original procedure is proposed for the estimation of the fatigue damage potential, aiming to minimize the computational time. The research thus aims to improve both the effectiveness and the efficiency of excitation profile synthesis for accelerated durability tests.
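As a brief, hedged illustration of the kurtosis parameter discussed above, the sketch below compares a Gaussian random signal (kurtosis close to 3) with a burst-contaminated one; the signals are synthetic, and the kurtosis-control and FDS synthesis algorithms themselves are not reproduced here.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(42)
n = 100_000

gaussian = rng.normal(size=n)                 # stationary Gaussian excitation
bursts = gaussian.copy()
idx = rng.choice(n, size=n // 200, replace=False)
bursts[idx] *= 6.0                            # inject occasional high peaks

# fisher=False returns "plain" kurtosis, i.e. about 3 for a Gaussian signal.
print(f"Gaussian kurtosis ~ {kurtosis(gaussian, fisher=False):.2f}")
print(f"Bursty   kurtosis ~ {kurtosis(bursts, fisher=False):.2f}")
```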

Relevance: 20.00%

Abstract:

The aim was to assess the construct validity and reliability of the Pediatric Patient Classification Instrument in a correlational study developed at a teaching hospital. The classification involved 227 patients, using the pediatric patient classification instrument. Construct validity was assessed through factor analysis and reliability through internal consistency. The Exploratory Factor Analysis identified three constructs explaining 67.5% of the variance and, in the reliability assessment, the following Cronbach's alpha coefficients were found: 0.92 for the instrument as a whole; 0.88 for the Patient domain; 0.81 for the Family domain; and 0.44 for the Therapeutic procedures domain. The instrument evidenced its construct validity and reliability, and these analyses indicate its feasibility. The validation of the Pediatric Patient Classification Instrument still represents a challenge, given its relevance for a closer look at pediatric nursing care and management. Further research should be considered to explore its dimensionality and content validity.
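As a hedged illustration of the exploratory factor analysis step described above, the sketch below fits a three-factor model to simulated item responses; the item count, sample and loadings are invented and merely mirror the number of constructs reported, and applied work would typically add factor rotation and dedicated EFA tooling.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(7)
n_patients, n_items = 227, 12               # 12 illustrative items, not the real instrument
latent = rng.normal(size=(n_patients, 3))   # three hypothetical underlying constructs
loadings = rng.uniform(0.4, 0.9, size=(3, n_items))
scores = latent @ loadings + rng.normal(scale=0.5, size=(n_patients, n_items))

fa = FactorAnalysis(n_components=3, random_state=0)
fa.fit(scores)
print(fa.components_.shape)  # (3, n_items): loading of each item on each factor
```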

Relevance: 20.00%

Abstract:

Universidade Estadual de Campinas. Faculdade de Educação Física

Relevance: 20.00%

Abstract:

Secondary caries has been reported as the main reason for restoration replacement. The aim of this in vitro study was to evaluate the performance of different methods - visual inspection, laser fluorescence (DIAGNOdent), radiography and tactile examination - for secondary caries detection in primary molars restored with amalgam. Fifty-four primary molars were photographed and 73 suspect sites adjacent to amalgam restorations were selected. Two examiners independently evaluated these sites using all methods. Agreement between examiners was assessed by the Kappa test. To validate the methods, a caries-detector dye was used after restoration removal. The best cut-off points for the sample were found by Receiver Operating Characteristic (ROC) analysis, and the area under the ROC curve (Az), sensitivity, specificity and accuracy of the methods were calculated for the enamel (D2) and dentine (D3) thresholds. These parameters were found for each method and then compared by the McNemar test. The tactile examination and visual inspection presented the highest inter-examiner agreement for the D2 and D3 thresholds, respectively. The visual inspection also showed better performance than the other methods for both thresholds (Az = 0.861 and Az = 0.841, respectively). In conclusion, visual inspection presented the best performance for detecting enamel and dentine secondary caries in primary teeth restored with amalgam.
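A compact, hedged sketch of the diagnostic-performance quantities used above (sensitivity, specificity, accuracy and the area under the ROC curve, Az), computed on hypothetical detection scores against a gold standard; the cut-off and score values are placeholders, not the study's data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)
truth = rng.integers(0, 2, size=73)                 # gold standard (1 = caries present)
scores = truth * 0.6 + rng.uniform(size=73) * 0.7   # hypothetical detection-method scores
pred = (scores >= 0.65).astype(int)                 # cut-off chosen from the ROC analysis

tp = np.sum((pred == 1) & (truth == 1))
tn = np.sum((pred == 0) & (truth == 0))
fp = np.sum((pred == 1) & (truth == 0))
fn = np.sum((pred == 0) & (truth == 1))

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
accuracy = (tp + tn) / len(truth)
az = roc_auc_score(truth, scores)                   # area under the ROC curve (Az)

print(f"Se={sensitivity:.2f} Sp={specificity:.2f} Acc={accuracy:.2f} Az={az:.2f}")
```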

Relevance: 20.00%

Abstract:

OBJECTIVE: This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) readings in detecting plaque. MATERIAL AND METHODS: Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque, with and without disclosing, was assessed using the VI. Images were obtained with the FC and a digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare the different sample conditions and to assess inter-examiner reproducibility. RESULTS: Some methods presented adequate reproducibility. The Turesky index and the assessment of the area covered by disclosed plaque in the FC images presented the highest discriminatory power. CONCLUSION: The Turesky index and FC images with disclosing present good reliability and discriminatory power in quantifying dental plaque.
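As a short, hedged illustration of the inter-examiner agreement statistic (the Kappa test) used in this and the preceding study, the sketch below computes Cohen's kappa on hypothetical paired ratings from two examiners; the scores are invented.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical plaque scores (0-5, Turesky-style) assigned by two examiners.
examiner_1 = [0, 1, 2, 2, 3, 4, 5, 1, 2, 3, 4, 0]
examiner_2 = [0, 1, 2, 3, 3, 4, 5, 1, 1, 3, 4, 0]

print(f"kappa = {cohen_kappa_score(examiner_1, examiner_2):.2f}")
```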