955 results for automatic test and measurement
Abstract:
The aim was to investigate inter-tester and intra-tester reliability, and the parallel reliability between a visual assessment method and a pachymeter-based method, for locating the mid-point of the patella when determining medial/lateral patellar orientation. Fifteen asymptomatic subjects were assessed, and the mid-point of the patella was determined by both methods on two separate occasions two weeks apart. Inter-tester reliability was assessed by ANOVA and the intraclass correlation coefficient (ICC); intra-tester reliability by a paired t-test and ICC; and parallel reliability by Pearson's correlation and ICC, for the first and second evaluations. There was acceptable inter-tester agreement (p = 0.490) and reliability for visual inspection (ICC = 0.747) and for the pachymeter (ICC = 0.716) at the second evaluation. Inter-tester reliability at the first evaluation was unacceptable (visual ICC = 0.604; pachymeter ICC = 0.612). Although measurements were statistically similar between the first and second evaluations for all testers, intra-tester reliability was unacceptable for both methods: visual (examiner 1 ICC = 0.175; examiner 2 ICC = 0.189; examiner 3 ICC = 0.155) and pachymeter (examiner 1 ICC = 0.214; examiner 2 ICC = 0.246; examiner 3 ICC = 0.069). Parallel reliability showed a strong correlation at the first evaluation (r = 0.828; p < 0.001) and at the second (r = 0.756; p < 0.001), with reliability between acceptable and very good (ICC = 0.748 to 0.813). Both the visual and pachymeter methods therefore provide similar, reliable medial/lateral patellar orientation across different examiners, but the results between the two assessments at a 2-week interval demonstrated unacceptable reliability. (C) 2009 Elsevier B.V. All rights reserved.
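ICC values like those reported above can be reproduced with a short routine. The sketch below, in Python with NumPy, implements the generic ICC(2,1) form (two-way random effects, absolute agreement, single measure); the function name and the choice of ICC form are illustrative assumptions, not the authors' actual analysis code.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.

    `ratings` is an n x k array: n subjects rated by k examiners.
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)   # per-subject means
    col_means = x.mean(axis=0)   # per-examiner means
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)   # between subjects
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)   # between examiners
    sse = np.sum((x - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                        # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)
```

With perfectly agreeing examiners the function returns 1.0; systematic disagreement between examiners lowers the value, which is why the absolute-agreement form is the usual choice for inter-tester reliability.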
Abstract:
Purpose: Morphometric measurements of the ascending aorta have recently been performed with ECG-gated MDCT to support the development of future endovascular therapies (TCT) [1]. However, the variability of these measurements remains unknown. It would therefore be useful to know the impact of computer-aided diagnosis (CAD), with automated vessel segmentation and automatic diameter measurement, on the management of ascending aorta aneurysms. Methods and Materials: Thirty patients referred for ECG-gated CT thoracic angiography (64-row CT scanner) were evaluated. Measurements of the maximum and minimum ascending aorta diameters were obtained automatically with a commercially available CAD system and semi-manually by two observers working separately. The CAD algorithms segment the IV-enhanced lumen of the ascending aorta in planes perpendicular to the centreline; the CAD then determines the largest and smallest diameters. Both observers repeated the automatic and semi-manual measurements in a separate session at least one month after the first measurements. The Bland and Altman method was used to study inter- and intraobserver variability, and a Wilcoxon signed-rank test was used to analyse differences between observers. Results: Interobserver variability for semi-manual measurements between the first and second observers was 1.2 mm and 1.0 mm for the maximal and minimal diameters, respectively. Intraobserver variability for each observer ranged from 0.8 to 1.2 mm, the lowest variability being produced by the more experienced observer. CAD variability could be as low as 0.3 mm, showing that it can outperform human observers. However, under non-optimal conditions (streak artefacts from contrast in the superior vena cava, or weak lumen enhancement), CAD variability can be as high as 0.9 mm, matching that of semi-manual measurements.
Furthermore, there were significant differences between the two observers for both maximal and minimal diameter measurements (p<0.001). There was also a significant difference between the first observer and CAD for maximal diameter measurements, the former underestimating the diameter compared with the latter (p<0.001). Minimal diameters were higher when measured by the second observer than when measured by CAD (p<0.001). Neither the difference in mean minimal diameter between the first observer and CAD, nor the difference in mean maximal diameter between the second observer and CAD, was significant (p=0.20 and 0.06, respectively). Conclusion: CAD algorithms can reduce the variability of diameter measurements in the follow-up of ascending aorta aneurysms. Nevertheless, under non-optimal conditions it may be necessary to correct the measurements manually; improvements to the algorithms should help avoid this situation.
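The Bland and Altman analysis mentioned above reduces to a bias and 95% limits of agreement on the paired differences. A minimal sketch in Python with NumPy (function name assumed, illustrative only):

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Return (bias, lower LoA, upper LoA) for two paired measurement series."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    bias = diff.mean()                 # mean difference between methods
    sd = diff.std(ddof=1)              # SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

Paired maximal-diameter readings (in mm) from the two observers, or from an observer and the CAD, would be passed as the two series; the width of the limits of agreement corresponds to the variability figures quoted in the abstract.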
Abstract:
The Iowa State Highway Commission purchased a Conrad automatic freeze-and-thaw machine and placed it in operation in October 1961. There were a few problems but, considering the many electrical and mechanical devices used in the automatic system, it has always functioned quite well. Rapid freezing and thawing of 4"x4"x18" concrete beams has been conducted primarily in accordance with ASTM C-291 (now ASTM C-666, procedure B) at the rate of one beam per day. Over 4000 beams have been tested since 1961, with determination of the resulting durability factors. Various methods of curing were tried and a standard 90-day moist cure was selected; this cure seemed to yield durability factors that correlated very well with ratings of coarse aggregates based on service records. Some concrete beams had been made using the same coarse aggregate, and their durability factors compared relatively well with previous tests. Durability factors seemed to yield reasonable results until large variations were noted between beams of identical concrete mix proportions in research projects R-234 and R-247. This raises the question: how reliable is the durability factor as determined by ASTM C-666? The question became increasingly important when a specification requiring a minimum durability factor for P.C. concrete made from coarse aggregates was incorporated into the 1972 Standard Specification for coarse aggregates for concrete.
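For reference, the durability factor under ASTM C-666 is computed as DF = P·N/M, where P is the relative dynamic modulus of elasticity (%) at termination, N is the number of cycles completed at termination, and M is the specified number of cycles (commonly 300). A minimal sketch (function name assumed; the standard's detailed termination rules are not reproduced here):

```python
def durability_factor(p_percent, n_cycles, m_cycles=300):
    """DF = P * N / M: P = relative dynamic modulus (%) at termination,
    N = cycles completed at termination, M = specified cycle count."""
    return p_percent * n_cycles / m_cycles
```

A beam surviving all 300 cycles at 90% relative dynamic modulus scores DF = 90, whereas one terminated at 150 cycles with 60% scores DF = 30, which is how modest differences in deterioration rate can produce the large spread in durability factors the abstract describes.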
Abstract:
Several methods are used to estimate the anaerobic threshold (AT) during exercise. The aim of the present study was to compare AT obtained by a graphic visual method based on ventilatory and metabolic variables (the gold standard) with a bi-segmental linear regression mathematical model, Hinkley's algorithm, applied to heart rate (HR) and carbon dioxide output (VCO2) data. Thirteen young (24 ± 2.63 years old) and 16 postmenopausal (57 ± 4.79 years old) healthy, sedentary women performed a continuous incremental ergospirometric test on an electromagnetically braked cycle ergometer, with increases of 10 to 20 W/min until physical exhaustion. Ventilatory variables were recorded breath by breath and HR was obtained beat by beat in real time. Data were analyzed by the nonparametric Friedman test and the Spearman correlation test, with the level of significance set at 5%. Power output (W), HR (bpm), oxygen uptake (VO2; mL kg-1 min-1), VO2 (mL/min), VCO2 (mL/min), and minute ventilation (VE; L/min) at the AT were similar for both methods and both groups studied (P > 0.05). The VO2 (mL kg-1 min-1) data showed significant correlations (P < 0.05) between the gold standard method and the mathematical model applied to HR (rs = 0.75) and VCO2 (rs = 0.78) for the subjects as a whole (N = 29). The proposed mathematical method for detecting changes in the response patterns of VCO2 and HR was adequate and promising for AT detection in young and middle-aged women, representing a semi-automatic, non-invasive and objective AT measurement.
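Hinkley's algorithm fits a change-point model to the data; a brute-force sketch of the same bi-segmental idea is shown below in Python with NumPy: scan every candidate breakpoint, fit a straight line to each side, and keep the split with the smallest combined residual sum of squares. This illustrates the principle only and is not the authors' implementation.

```python
import numpy as np

def bisegmental_breakpoint(x, y):
    """Return the index splitting (x, y) into two straight-line segments
    with the smallest combined residual sum of squares."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    best_i, best_sse = None, np.inf
    for i in range(2, len(x) - 2):              # each segment needs >= 2 points
        sse = 0.0
        for xs, ys in ((x[:i], y[:i]), (x[i:], y[i:])):
            coef = np.polyfit(xs, ys, 1)        # least-squares line fit
            sse += np.sum((ys - np.polyval(coef, xs)) ** 2)
        if sse < best_sse:
            best_i, best_sse = i, sse
    return best_i
```

Applied to workload-ordered VCO2 or HR data, the returned breakpoint marks the point at which the response pattern changes, i.e. the AT estimate.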
Abstract:
Automatic indexing and retrieval of digital data pose major challenges. The main problem arises from the ever-increasing mass of digital media and the lack of efficient methods for indexing and retrieving such data based on semantic content rather than keywords. To enable intelligent web interactions, or even web filtering, we need to be capable of interpreting the information base in an intelligent manner. For a number of years, research has been ongoing in the field of ontological engineering with the aim of using ontologies to add such (meta) knowledge to information. In this paper, we describe the architecture of a system, Dynamic REtrieval Analysis and semantic metadata Management (DREAM), designed to automatically and intelligently index huge repositories of special-effects video clips, based on their semantic content, using a network of scalable ontologies to enable intelligent retrieval. The DREAM Demonstrator has been evaluated as deployed in the film post-production phase, supporting the storage, indexing and retrieval of large data sets of special-effects video clips as an exemplar application domain. This paper reports its performance and usability results and highlights the scope for future enhancements of the DREAM architecture, which has proven successful in its first and possibly most challenging proving ground, namely film production, where it is already in routine use within our test-bed partners' creative processes. (C) 2009 Published by Elsevier B.V.
Abstract:
Since Ranzini suggested supplementing the SPT with a measurement of the torque required to turn the split-spoon sampler after driving, many Brazilian engineers have used this measurement (the SPT-T) in the design of pile foundations. This paper presents a study of the influence of rod length on the torque measurement. A theoretical strength-of-materials analysis considering torsion and bending in a thin-walled tubular steel shaft was performed. It leads to the conclusion that the shear stress caused by the rod's self-weight represents less than 1% of the shear stress caused by the applied torque. In addition, an experimental study was carried out with electric torquemeters fixed to a horizontal rod system. Rods from one metre to twenty metres in length were analyzed, with measurements collected at the ends of each rod length to verify the efficiency data. The results show that the torque difference along the rod length is smaller than the minimum scale divisions of the mechanical torquemeters used in engineering practice. Another point to consider is the large torque loss for applied torques below 20 N.m; for this reason, the SPT-T is not adequate for low-consistency soils. Copyright ASCE 2007.
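The order-of-magnitude comparison in the theoretical study can be illustrated numerically. The sketch below computes the maximum torsional shear stress in a hollow rod (tau = T·r/J) and a rough average transverse shear stress from the rod's own weight for a horizontal rod supported at both ends (tau ≈ V/A, where the cross-sectional area cancels). The rod dimensions, steel density and applied torque are hypothetical illustrative values, not those of the rods in the study, and V/A is a crude average rather than the exact bending-shear distribution.

```python
import math

def torsion_shear_stress(torque_nm, r_out, r_in):
    """Max torsional shear stress in a hollow circular rod: tau = T * r_out / J."""
    j = math.pi / 2.0 * (r_out ** 4 - r_in ** 4)   # polar moment of area
    return torque_nm * r_out / j

def self_weight_shear_stress(length_m, rho=7850.0, g=9.81):
    """Rough average transverse shear stress from self-weight for a horizontal
    rod supported at both ends: tau ~ V / A with V = (rho * A * g * L) / 2,
    so the area cancels and tau = rho * g * L / 2."""
    return rho * g * length_m / 2.0
```

For a hypothetical 33 mm OD / 24 mm ID rod under 20 N.m, the torsional shear stress is a few MPa, while the self-weight contribution over a 2 m span is below 0.1 MPa, a small fraction of the torsional stress, consistent with the paper's conclusion that the self-weight contribution is negligible.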
Abstract:
In most studies on beef cattle longevity, only cows reaching a given number of calvings by a specific age are considered in the analyses. With the aim of evaluating all cows with a productive life in the herds, while taking into consideration the different forms of management on each farm, it was proposed to measure cow longevity from the age at last calving (ALC), that is, the most recent calving registered in the files. The objective was to characterize this trait in order to study the longevity of Nellore cattle, using Kaplan-Meier estimators and the Cox model. The covariables and class effects considered in the models were age at first calving (AFC), year and season of birth of the cow, and farm. The variable studied (ALC) was classified as presenting complete information (uncensored = 1) or incomplete information (censored = 0), using as the criterion the difference between the date of each cow's last calving and the date of the latest calving at her farm. If this difference was >36 months, the cow was considered to have failed; if not, the cow was censored, indicating that future calving remained possible for her. The records of 11 791 animals from 22 farms within the Nellore Breed Genetic Improvement Program ('Nellore Brazil') were used. In the estimation process using the Kaplan-Meier model, AFC was classified into three age groups. In individual analyses, the log-rank and Wilcoxon tests in the Kaplan-Meier model showed that all covariables and class effects had significant effects (P < 0.05) on ALC. In the analysis considering all covariables and class effects, using the Wald test in the Cox model, only the season of birth of the cow was not significant for ALC (P > 0.05). This analysis indicated that each month added to AFC diminished the risk of the cow's failure in the herd by 2%. Nonetheless, this does not imply that animals with a younger AFC were less profitable.
Cows with greater numbers of calvings were more precocious than those with fewer calvings. Copyright © The Animal Consortium 2012.
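The Kaplan-Meier estimate used above can be sketched in a few lines. The function below (name and data layout are illustrative assumptions) takes each cow's time on study and a failure/censoring flag and returns the product-limit survival curve.

```python
import numpy as np

def kaplan_meier(time, event):
    """Product-limit survival estimate.

    `time`: time on study for each animal; `event`: 1 = failed, 0 = censored.
    Returns a list of (time, survival probability) pairs at each failure time.
    """
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    surv, curve = 1.0, []
    for t in np.unique(time[event == 1]):      # distinct failure times
        at_risk = np.sum(time >= t)            # animals still in the herd at t
        failures = np.sum((time == t) & (event == 1))
        surv *= 1.0 - failures / at_risk       # product-limit update
        curve.append((float(t), surv))
    return curve
```

Stratifying such curves, for example by AFC group, and comparing them with log-rank or Wilcoxon tests corresponds to the individual-analysis step the abstract describes.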
Abstract:
Fundação de Amparo à Pesquisa do Estado de São Paulo (FAPESP)
Abstract:
The present research examined the prediction of school students' grades in an upcoming math test from their minimal grade goals (i.e., the minimum grade in an upcoming test one would be satisfied with). Because of its significance for initiating and maintaining goal-directed behavior, self-control capacity was expected to moderate the relation between students' minimal grade goals and their actual grades. Self-control capacity was defined as the dispositional capacity to override or alter one's dominant response tendencies. Prior to a scheduled math test, 172 vocational-track students indicated their minimal grade goal for the test and completed a measure of self-control capacity. The test grade was assessed at a second time of measurement. As expected, minimal grade goals predicted the actual test grades more strongly the higher the students' self-control capacity. Implications can be seen in terms of optimizing the prediction and advancement of academic performance.
Abstract:
Mistreatment and self-neglect significantly increase the risk of death in older adults. An estimated 1 to 2 million older adults experience elder mistreatment or self-neglect every year in the United States. Currently, no elder mistreatment and self-neglect assessment tools have undergone construct validity and measurement invariance testing, and no studies have sought to identify underlying latent classes of elder self-neglect that may have differential mortality rates. Using data from 11,280 adults with Texas APS-substantiated elder mistreatment and self-neglect, three studies were conducted to: (1) test the construct validity and (2) the measurement invariance across gender and ethnicity of the Texas Adult Protective Services (APS) Client Assessment and Risk Evaluation (CARE) tool, and (3) identify latent classes associated with elder self-neglect. Study 1 confirmed the construct validity of the CARE tool following adjustments to the initially hypothesized version, resulting in the deletion of 14 assessment items and a final assessment with the 5 original factors and 43 items; cross-validation of this model was achieved. Study 2 provided empirical evidence for factor-loading and item-threshold invariance of the CARE tool across gender and between African-Americans and Caucasians. The financial status domain of the CARE tool did not function properly for Hispanics and therefore had to be deleted; subsequent analyses showed factor-loading and item-threshold invariance across all three ethnic groups, with the exception of some residual errors. Study 3 identified four latent classes associated with elder self-neglect behaviors, comprising individuals with evidence of problems in (1) their environment, (2) physical and medical status, (3) multiple domains, and (4) finances.
Overall, these studies provide evidence supporting the use of the APS CARE tool for unbiased and valid investigations of mistreatment and neglect in older adults with different demographic characteristics. Furthermore, the findings support the notion that elder self-neglect may not only occur along a continuum but may also take differential types. Both points have important potential implications for the social and health services provided to vulnerable mistreated and neglected older adults.
Abstract:
Contents (cont.):
VI. The application of standard measurements to school administration. [By] D.C. Bliss.
VII. A half-year's progress in the achievement of one school system. A. The progress as measured by the Thorndike visual vocabulary test. B. The progress as measured by the Courtis tests, series B. [By] H.G. Childs.
VIII. Courtis tests in arithmetic: value to superintendents and teacher. [By] S.A. Courtis.
IX. Use of standard tests at Salt Lake City, Utah. [By] E.P. Cubberley.
X. Reading. [By] C.H. Judd.
XI. Studies by the Bureau of research and efficiency of Kansas City, Mo. [By] George Melcher.
XII. The effects of efficiency tests in reading on a city school system. [By] E.E. Oberholtzer.
XIII. Investigation of spelling in the schools of Oakland, Cal. [By] J.B. Sears.
XIV. Standard tests as aids in the classification and promotion of pupils. [By] Daniel Starch.
XV. The use of mental tests in the school. [By] G.M. Whipple.
Abstract:
Photocopy.
Abstract:
This paper presents MRI measurements of the response of a novel semi-solid MR contrast agent to pressure. The agent comprises potassium chloride cross-linked carrageenan gum at a concentration of 2% w/v, with micron-sized lipid-coated air bubbles at a concentration of 3% v/v. The choice of an optimum suspending medium, the production methods and the preliminary MRI results are presented herein. The carrageenan gum is shown to be ideally elastic for compressions corresponding to volume changes of less than 15%, in contrast to the inelastic gellan gum also tested. Although slightly lower than that of gellan gum, the water diffusion coefficient of carrageenan, 1.72×10^-9 m^2 s^-1, indicates its suitability for this purpose. RARE imaging was performed while simultaneously compressing test and control samples, and a maximum sensitivity of 1.6% MR signal change per % volume change was found, which is shown to be independent of proton-density variations due to the presence of microbubbles and compression. This contrast agent could prove useful for numerous applications, particularly in chemical engineering. More generally, the method allows the user to non-invasively image with MRI any process that causes, within the solid, local changes in either bubble size or bubble shape. © 2008 American Institute of Physics.
Abstract:
As the largest source of dimensional measurement uncertainty, addressing the challenges of thermal variation is vital to ensure product and equipment integrity in the factories of the future. While it is possible to closely control room temperature, this is often not practical or economical to realise in all cases where inspection is required. This article reviews recent progress and trends in seven key commercially available industrial temperature measurement sensor technologies primarily in the range of 0 °C–50 °C for invasive, semi-invasive and non-invasive measurement. These sensors will ultimately be used to measure and model thermal variation in the assembly, test and integration environment. The intended applications for these technologies are presented alongside some consideration of measurement uncertainty requirements with regard to the thermal expansion of common materials. Research priorities are identified and discussed for each of the technologies as well as temperature measurement at large. Future developments are briefly discussed to provide some insight into which direction the development and application of temperature measurement technologies are likely to head.
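To put the measurement uncertainty requirements in context, linear thermal expansion follows dL = alpha * L * dT. The helper below (names and the rounded handbook coefficients are illustrative assumptions) shows why sub-degree temperature knowledge matters for metre-scale parts:

```python
def thermal_expansion_um(length_m, delta_t_c, alpha_per_c):
    """Linear thermal expansion dL = alpha * L * dT, returned in micrometres."""
    return alpha_per_c * length_m * delta_t_c * 1e6

# Approximate room-temperature expansion coefficients (1/degC), for illustration.
ALPHA = {"steel": 11.7e-6, "aluminium": 23.1e-6, "invar": 1.2e-6}
```

A 1 m steel artefact expands by roughly 11.7 micrometres per degree Celsius, so holding a 10-micrometre measurement uncertainty budget requires knowing its temperature to better than about 1 degree.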
Abstract:
This research explored the relationship between system test complexity and tacit knowledge. It is proposed as part of this thesis that the process of system testing (comprising test planning, test development, test execution, test fault analysis, test measurement, and case management) is directly affected both by complexity associated with the system under test and by other sources of complexity, independent of the system under test but related to the wider process of system testing. While a certain amount of knowledge related to the system under test is inherently tacit in nature, and therefore difficult to make explicit, it was found that a significant amount of knowledge relating to these other sources of complexity can indeed be made explicit. While the importance of explicit knowledge was reinforced by this research, no evidence suggested that the availability of tacit knowledge to a test team is of any less importance to the process of system testing when operating in a traditional software development environment. Participants commonly expressed the sentiment that, even though a considerable amount of explicit knowledge about the system is freely available, a good deal of the knowledge about the system under test that effective system testing demands is actually tacit in nature (approximately 60% of participants in a traditional development environment, and 60% of participants in an agile development environment, expressed similar sentiments). To cater for the availability of tacit knowledge relating to the system under test, and indeed both the explicit and the tacit knowledge required by system testing in general, an appropriate knowledge management structure needs to be in place. This appears to be required irrespective of the development methodology employed.