964 results for field test
Abstract:
In this paper, we describe an efficient theoretical approach to determining integral characteristics, such as the Mode Field Diameter (MFD) and the V-parameter, of weakly guiding waveguides. To test the described method, we measured the MFD of a commercially available step-index single-mode fibre with known parameters. The results of these measurements are presented for two different wavelengths. It is worth noting that the developed approach assumes an infinite cladding; care should therefore be taken to avoid the influence of the finite cladding when the MFD is comparable to the cladding diameter.
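For orientation, here is a minimal sketch of the standard reference calculation for such fibres: the V-parameter and Marcuse's approximation for the MFD of a step-index fibre with infinite cladding. This is a generic textbook formula, not the paper's own approach, and the core radius, numerical aperture, and wavelengths below are assumed for illustration.

```python
# A generic reference calculation (not the paper's method): V-parameter and
# Marcuse's MFD approximation for a step-index fibre with infinite cladding.
import math

def v_parameter(core_radius_um: float, wavelength_um: float, na: float) -> float:
    """Normalized frequency V = (2*pi*a / lambda) * NA."""
    return 2 * math.pi * core_radius_um / wavelength_um * na

def mfd_marcuse(core_radius_um: float, v: float) -> float:
    """MFD = 2w with Marcuse's fit w/a = 0.65 + 1.619*V**-1.5 + 2.879*V**-6."""
    w = core_radius_um * (0.65 + 1.619 * v**-1.5 + 2.879 * v**-6)
    return 2 * w

# Illustrative single-mode fibre parameters (assumed, not the measured fibre)
a_um, na = 4.1, 0.12
for lam_um in (1.31, 1.55):  # two example test wavelengths
    v = v_parameter(a_um, lam_um, na)
    print(f"lambda = {lam_um} um: V = {v:.3f}, MFD = {mfd_marcuse(a_um, v):.2f} um")
```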
Abstract:
Mobile technologies have yet to be widely adopted by the Architectural, Engineering, and Construction (AEC) industry, despite mobile computing being one of the major growth areas in computing in recent years. This lack of uptake in the AEC industry is likely due, in large part, to the combination of small screen size and inappropriate interaction demands of current mobile technologies. This paper discusses the scope for multimodal interaction design, with a specific focus on speech-based interaction, to enhance the suitability of mobile technology use within the AEC industry by broadening the field data input capabilities of such technologies. To investigate the appropriateness of using multimodal technology for field data collection in the AEC industry, we have developed a prototype Multimodal Field Data Entry (MFDE) application. This application, which allows concrete testing technicians to record quality control data in the field, has been designed to support two different modalities of data input: speech-based data entry and stylus-based data entry. To compare the effectiveness and usability of, and user preference for, the two input options, we have designed a comprehensive lab-based evaluation of the application. To appropriately reflect the anticipated context of use within the study design, careful consideration had to be given to the key elements of a construction site that would potentially influence a test technician's ability to use the input techniques. These considerations and the resultant evaluation design are discussed in detail in this paper.
Abstract:
PURPOSE: Multifocal visual evoked potential (mfVEP) is a newly introduced method for objective visual field assessment. Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard automated perimetry (SAP) visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. METHODS: Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey field analyzer (HFA) 24-2 tests and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol, the hemifield sector analysis (HSA) protocol. Analysis of the HFA was done using the standard grading system. RESULTS: Analysis of mfVEP results showed a statistically significant difference between the three groups in mean signal-to-noise ratio (ANOVA, p < 0.001 with a 95% confidence interval). The difference between superior and inferior hemifields was statistically significant in all 11 sectors in the glaucoma patient group (t-test, p < 0.001), partially significant in 5/11 sectors in the glaucoma suspect group (t-test, p < 0.01), and not significant in most sectors of the normal group (1/11 sectors was significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, and for glaucoma suspect patients the values were 89% and 79%, respectively. CONCLUSIONS: The new HSA protocol used in mfVEP testing can be applied to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol can provide information about focal visual field differences across the horizontal midline, which can be utilized to differentiate between glaucoma and normal subjects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss.
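For the mechanics behind figures like these, here is a minimal sketch of the two reported computations in generic form: per-sector paired t-tests on superior-versus-inferior SNR, and sensitivity/specificity from classification counts. All arrays are synthetic placeholders, and the counts in the final line are merely chosen to reproduce the reported 97%/86%.

```python
# Synthetic illustration of the reported statistics; not study data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_eyes, n_sectors = 36, 11
superior = rng.normal(1.8, 0.4, (n_eyes, n_sectors))  # SNR, superior hemifield
inferior = rng.normal(1.2, 0.4, (n_eyes, n_sectors))  # SNR, inferior hemifield

# Paired t-test per sector, as in the 11-sector HSA comparison
for s in range(n_sectors):
    t, p = stats.ttest_rel(superior[:, s], inferior[:, s])
    print(f"sector {s + 1}: t = {t:.2f}, p = {p:.2g}")

def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Counts chosen to reproduce the reported 97% sensitivity / 86% specificity
# for 36 glaucoma eyes vs 38 normal eyes (illustrative only)
print(sensitivity_specificity(tp=35, fn=1, tn=33, fp=5))
```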
Abstract:
Several analysis protocols have been tested to identify early visual field losses in glaucoma patients using the mfVEP technique; some were successful in detecting field defects comparable to standard SAP visual field assessment, while others were not very informative and needed further adjustment and research. In this study we implemented a novel analysis approach and evaluated its validity and whether it could be used effectively for early detection of visual field defects in glaucoma. The purpose of this study is to examine the benefit of adding the mfVEP hemifield Intersector analysis protocol to the standard HFA test when there is suspicious glaucomatous visual field loss. Three groups were tested in this study: normal controls (38 eyes), glaucoma patients (36 eyes) and glaucoma suspect patients (38 eyes). All subjects had two standard Humphrey visual field (HFA) 24-2 tests, optical coherence tomography of the optic nerve head, and a single mfVEP test undertaken in one session. Analysis of the mfVEP results was done using the new analysis protocol, the Hemifield Sector Analysis (HSA) protocol. The retinal nerve fibre layer (RNFL) thickness was recorded to identify subjects with suspicious RNFL loss. The hemifield Intersector analysis of mfVEP results showed that the signal-to-noise ratio (SNR) difference between superior and inferior hemifields was statistically significant between the three groups (ANOVA, p < 0.001 with a 95% CI). The difference between superior and inferior hemifields was statistically significant in all 11/11 sectors in the glaucoma patient group (t-test, p < 0.001), partially significant in 5/11 sectors in the glaucoma suspect group (t-test, p < 0.01), and not significant in most sectors of the normal group (only 1/11 was significant; t-test, p < 0.9). Sensitivity and specificity of the HSA protocol in detecting glaucoma were 97% and 86%, respectively, while for glaucoma suspects they were 89% and 79%. The use of SAP and mfVEP results in subjects with suspicious glaucomatous visual field defects, identified by low RNFL thickness, is beneficial in confirming early visual field defects. The new HSA protocol used in mfVEP testing can be used to detect glaucomatous visual field defects in both glaucoma and glaucoma suspect patients. Using this protocol in addition to SAP analysis can provide information about focal visual field differences across the horizontal midline and confirm suspicious field defects. Sensitivity and specificity of the mfVEP test showed very promising results and correlated with other anatomical changes in glaucoma field loss. The Intersector analysis protocol can detect early field changes not detected by the standard HFA test.
Abstract:
2000 Mathematics Subject Classification: 62P10, 92D10, 92D30, 62F03
Abstract:
This study was an evaluation of a Field Project Model Curriculum and its impact on achievement, attitude toward science, attitude toward the environment, self-concept, and academic self-concept with at-risk eleventh- and twelfth-grade students. One hundred eight students were pretested and posttested on the Piers-Harris Children's Self-Concept Scale, PHCSC (1985); the Self-Concept as a Learner Scale, SCAL (1978); the Marine Science Test, MST (1987); the Science Attitude Inventory, SAI (1970); and the Environmental Attitude Scale, EAS (1972). Using a stratified random design, three groups of students were randomly assigned, according to sex and stanine level, to three treatment groups. Group one received the field project method, group two received the field study method, and group three received the field trip method. All three groups followed the marine biology course content as specified by the Florida Student Performance Objectives and Frameworks. The intervention lasted ten months, with each group participating in outside-of-classroom activities on a trimonthly basis. Analysis of covariance procedures were used to determine treatment effects. F-ratios, p-levels, and t-tests at p < .0062 (.05/8) indicated that a significant difference existed among the three treatment groups. Findings indicated that groups one and two were significantly different from group three, with group one displaying significantly higher results than group two. There were no significant differences between males and females in performance on the five dependent variables. The tenets underlying environmental education are congruent with the recommendations for the reform of science education. These include a value analysis approach, inquiry methods, and critical thinking strategies applied to environmental issues.
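The alpha level quoted above is the Bonferroni adjustment for eight comparisons; a quick check of that arithmetic, with placeholder p-values:

```python
# Bonferroni-adjusted per-test alpha: .05 split across 8 comparisons
alpha_family, n_tests = 0.05, 8
alpha_per_test = alpha_family / n_tests
print(alpha_per_test)  # 0.00625, quoted in the abstract as p < .0062

p_values = [0.0001, 0.004, 0.031]  # placeholder p-values, not study results
print([p < alpha_per_test for p in p_values])  # [True, True, False]
```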
Abstract:
Context effects in a personality scale were examined by determining whether conscientiousness scale (C) scores were significantly different when the scale was administered alone versus as part of a Five Factor Model inventory (Big5). The effectiveness of individual difference variables (IDVs) as predictors of the context effect was also examined. The experiment compared subjects who completed the full Big5 once and the C alone once (Big5/C or C/Big5) with subjects who completed either the Big5 inventory twice (Big5/Big5) or the C twice (C/C). No significant differences were found. When the Big5/C and C/Big5 groups were combined and the IDVs were tested, only the field dependence variable (R² = .06) significantly predicted the context effect. However, the small R² minimized concerns about context effects in Big5 inventories.
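As a sketch of this kind of individual-differences analysis, the snippet below regresses a context-effect score (C administered alone minus C within the Big5) on a field dependence measure and reports R². The data are synthetic, with the effect size chosen only to echo the small reported value.

```python
# Synthetic illustration of predicting a context effect from an IDV.
import numpy as np

rng = np.random.default_rng(1)
n = 120
field_dep = rng.normal(0, 1, n)                          # field dependence scores
context_effect = 0.25 * field_dep + rng.normal(0, 1, n)  # C_alone - C_in_Big5

# Ordinary least squares; R^2 = 1 - SS_res / SS_tot
slope, intercept = np.polyfit(field_dep, context_effect, 1)
pred = slope * field_dep + intercept
ss_res = np.sum((context_effect - pred) ** 2)
ss_tot = np.sum((context_effect - context_effect.mean()) ** 2)
print(f"R^2 = {1 - ss_res / ss_tot:.3f}")  # small, in the spirit of the reported .06
```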
Abstract:
The purpose of this study was to test Lotka's law of scientific publication productivity, using the methodology outlined by Pao (1985), in the field of Library and Information Studies (LIS). Lotka's law has been sporadically tested in the field over the past 30+ years, but the results of these studies are inconclusive due to the varying methods employed by the researchers. A data set of 1,856 citations found using the ISI Web of Knowledge databases was studied. The values of n and c were calculated to be 2.1 and 0.6418 (64.18%), respectively. The Kolmogorov-Smirnov (K-S) one-sample goodness-of-fit test was conducted at the 0.10 level of significance. The Dmax value is 0.022758 and the calculated critical value is 0.026562. It was determined that the null hypothesis, stating that there is no difference between the observed distribution of publications and the distribution obtained using Lotka's and Pao's procedure, could not be rejected. This study finds that the literature in the field of Library and Information Studies does conform to Lotka's law with reliable results. As a result, Lotka's law can be used in LIS as a standardized means of measuring author publication productivity, which will lead to findings that are comparable on many levels (e.g., department, institution, national). Lotka's law can be employed as an empirically proven analytical tool to establish publication productivity benchmarks for faculty and faculty librarians. Recommendations for further study include (a) exploring the characteristics of the high and low producers; (b) finding a way to successfully account for collaborative contributions in the formula; and (c) a detailed study of institutional policies concerning publication productivity and its impact on the appointment, tenure, and promotion process of academic librarians.
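A minimal sketch of the Lotka/Pao machinery referenced here: expected author proportions from the generalized law f(y) = c / y^n with the reported n = 2.1 and c = 0.6418, and the K-S statistic Dmax against an observed distribution. The observed counts below are placeholders, and the author total feeding the 0.10-level critical value (1.22/√N) is an assumption back-solved from the reported 0.026562.

```python
# Illustrative Lotka's-law check; observed data are placeholders.
import numpy as np

n_exp, c = 2.1, 0.6418               # reported parameter estimates
y = np.arange(1, 21)                 # authors with 1..20 publications
expected = c / y**n_exp              # expected proportion of authors

# Placeholder observed proportions (a real test uses the 1,856-citation data)
observed = np.clip(expected + np.random.default_rng(2).normal(0, 0.002, y.size), 0, None)
observed *= expected.sum() / observed.sum()

# K-S statistic: largest gap between the cumulative distributions
dmax = np.max(np.abs(np.cumsum(observed) - np.cumsum(expected)))
print(f"Dmax = {dmax:.6f}")

n_authors = 2110                     # assumed; back-solved from the reported 0.026562
print(f"critical (0.10 level) = {1.22 / np.sqrt(n_authors):.6f}")
```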
Abstract:
The move from Standard Definition (SD) to High Definition (HD) represents a six-fold increase in the data that needs to be processed. With expanding resolutions and evolving compression, there is a need for high performance with flexible architectures to allow for quick upgradability. Technology continues to advance in image display resolution, compression techniques, and video intelligence. Software implementations of these systems can attain accuracy with trade-offs among processing performance (to achieve specified frame rates while working on large image data sets), power, and cost constraints. There is a need for new architectures that keep pace with the fast innovations in video and imaging. This dissertation presents dedicated hardware implementations of the pixel- and frame-rate processes on a Field Programmable Gate Array (FPGA) to achieve real-time performance. The contributions of the dissertation are as follows. (1) We develop a target detection system by applying a novel running average mean threshold (RAMT) approach to globalize the threshold required for background subtraction. This approach adapts the threshold automatically to different environments (indoor and outdoor) and different targets (humans and vehicles). For low power consumption and better performance, we design the complete system on an FPGA. (2) We introduce a safe distance factor and develop an algorithm for detecting occlusion occurrence during target tracking. A novel mean threshold is calculated by motion-position analysis. (3) A new strategy for gesture recognition is developed using Combinational Neural Networks (CNN) based on a tree structure. The method is analysed on American Sign Language (ASL) gestures. We introduce a novel points-of-interest approach to reduce the feature vector size and a gradient threshold approach for accurate classification. (4) We design a gesture recognition system using a hardware/software co-simulation neural network for the high speed and low memory storage requirements provided by the FPGA. We develop an innovative maximum distance algorithm which uses only 0.39% of the image as the feature vector to train and test the system design. Gesture sets vary across applications, so it is highly desirable to keep the feature vector as small as possible while maintaining the same accuracy and performance.
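As context for contribution (1), here is a minimal software sketch of classical running-average background subtraction with a global, frame-adaptive threshold. It is a generic illustration of the family of techniques involved, not the dissertation's FPGA design or its exact RAMT thresholding.

```python
# Generic running-average background subtraction (software illustration only).
import numpy as np

def update_background(bg, frame, alpha=0.05):
    """Exponential running average: bg <- (1 - alpha) * bg + alpha * frame."""
    return (1.0 - alpha) * bg + alpha * frame

def foreground_mask(bg, frame, k=2.5):
    """Flag pixels deviating from the background by more than a global
    threshold derived from the frame's mean absolute deviation."""
    diff = np.abs(frame - bg)
    return diff > k * diff.mean()

# Toy grayscale frames; a real pipeline would stream camera input
rng = np.random.default_rng(3)
bg = rng.normal(100, 5, (240, 320))
frame = bg + rng.normal(0, 5, bg.shape)
frame[100:140, 150:200] += 60        # synthetic moving target
bg = update_background(bg, frame)
print(f"foreground pixels: {foreground_mask(bg, frame).sum()}")
```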
Abstract:
Natural Resources Wales and Steven Griffiths are thanked for access to Caerwys quarry and permission to work on the site. Sebastiaan Edelman and Thomas Logeman assisted with fieldwork and provided some of the field photographs. Bouke Lacet (Sedimentology laboratory, VU University Amsterdam) prepared the thin-sections. Three anonymous reviewers helped to sharpen the manuscript, and Sherry Cady provided valuable editorial advice and assistance. A.T.B. was inspired by Martin Brasier. He dedicates this manuscript to his father’s memory.
Abstract:
In this thesis we study aspects of (0,2) superconformal field theories (SCFTs), which are suitable for compactification of the heterotic string. In the first part, we study a class of (2,2) SCFTs obtained by fibering a Landau-Ginzburg (LG) orbifold CFT over a compact Kähler base manifold. While such models are naturally obtained as phases in a gauged linear sigma model (GLSM), our construction is independent of such an embedding. We discuss the general properties of such theories and present a technique to study the massless spectrum of the associated heterotic compactification. We test the validity of our method by applying it to hybrid phases of GLSMs and comparing spectra among the phases. In the second part, we turn to the study of the role of accidental symmetries in two-dimensional (0,2) SCFTs obtained by RG flow from (0,2) LG theories. These accidental symmetries are ubiquitous, and, unlike in the case of (2,2) theories, their identification is key to correctly identifying the IR fixed point and its properties. We develop a number of tools that help to identify such accidental symmetries in the context of (0,2) LG models and provide a conjecture for a toric structure of the SCFT moduli space in a large class of models. In the final part, we study the stability of heterotic compactifications described by (0,2) GLSMs with respect to worldsheet instanton corrections to the space-time superpotential following the work of Beasley and Witten. We show that generic models elude the vanishing theorem proved there, and may not determine supersymmetric heterotic vacua. We then construct a subclass of GLSMs for which a vanishing theorem holds.
Abstract:
Large efforts are ongoing within the EU to prepare the Marine Strategy Framework Directive's (MSFD) assessment of the environmental status of the European seas. This assessment will only be as good as the indicators chosen to monitor the eleven descriptors of good environmental status (GEnS). An objective and transparent framework to determine whether chosen indicators actually support the aims of this policy is, however, not yet in place. Such frameworks are needed to ensure that the limited resources available to this assessment optimize the likelihood of achieving GEnS within collaborating states. Here, we developed a hypothesis-based protocol to evaluate whether candidate indicators meet quality criteria explicit to the MSFD, to which the assessment community aspires. Eight quality criteria are distilled from existing initiatives, and a testing and scoring protocol for each of them is presented. We exemplify the framework's application in three worked examples, covering indicators for three GEnS descriptors (1, 5 and 6), various habitat components (seaweeds, seagrasses, benthic macrofauna and plankton), and assessment regions (Danish, Lithuanian and UK waters). We argue that this framework provides a necessary, transparent and standardized structure to support the comparison of candidate indicators and the decision-making process leading to indicator selection. Its application could help identify potential limitations in currently available candidate metrics and, in such cases, help focus the development of more adequate indicators. Use of such standardized approaches will facilitate the sharing of knowledge gained across the MSFD parties despite context-specificity across assessment regions, and support the evidence-based management of European seas.