964 results for script-driven test program generation process
Abstract:
The electronics industry is experiencing two trends, one of which is the drive toward miniaturization of electronic products. In-circuit testing, predominantly used for continuity testing of printed circuit boards (PCBs), can no longer meet the demands of smaller circuits. This has led to the development of moving-probe test equipment. Moving-probe testing opens up the opportunity to test PCBs whose test points are on a small pitch (distance between points). However, because the test uses probes that move sequentially, the total test time is much greater than in traditional in-circuit testing. While significant effort has concentrated on equipment design and development, little work has examined algorithms for efficient test sequencing. The test sequence has the greatest impact on total test time, which determines the production cycle time of the product. Minimizing total test time is an NP-hard problem similar to the traveling salesman problem, except with two traveling salesmen that must coordinate their movements. The main goal of this thesis was to develop a heuristic algorithm to minimize the flying-probe test time and to evaluate it against a "nearest neighbor" algorithm. The algorithm was implemented in Visual Basic with an MS Access database and evaluated on actual PCB test data taken from industry. A statistical analysis at the 95% confidence level was performed to test the hypothesis that the proposed algorithm finds a sequence whose total test time is less than that found by the "nearest neighbor" approach. The findings demonstrated that the proposed heuristic algorithm reduces the total test time and, therefore, that production cycle time can be reduced through proper sequencing.
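As a rough illustration of the "nearest neighbor" baseline the thesis compares against, the following is a generic single-probe sketch (not the thesis's Visual Basic implementation; the coordinates, start position, and single-probe simplification are assumptions — the real machine must coordinate two probes):

```python
import math

def nearest_neighbor_sequence(points, start=(0.0, 0.0)):
    """Greedy nearest-neighbor ordering of test points.

    `points` is a list of (x, y) probe targets; the probe always
    moves next to the closest unvisited point.
    """
    remaining = list(points)
    pos = start
    order = []
    while remaining:
        nxt = min(remaining, key=lambda p: math.dist(pos, p))
        remaining.remove(nxt)
        order.append(nxt)
        pos = nxt
    return order

def tour_length(order, start=(0.0, 0.0)):
    """Total probe travel distance for a given visiting order."""
    total, pos = 0.0, start
    for p in order:
        total += math.dist(pos, p)
        pos = p
    return total
```

Travel distance stands in for test time here; a heuristic such as the one proposed in the thesis would aim to produce an ordering with a smaller `tour_length` than this greedy baseline.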
Abstract:
In the near future, the LHC experiments will continue to be upgraded as the LHC luminosity increases, with the HL-LHC project, from the design value of 10³⁴ cm⁻²s⁻¹ to 7.5 × 10³⁴ cm⁻²s⁻¹, reaching 3000 fb⁻¹ of accumulated statistics. After the end of a period of data collection, CERN will face a long shutdown to improve overall performance by upgrading the experiments and implementing more advanced technologies and infrastructures. In particular, ATLAS will upgrade parts of the detector, the trigger, and the data acquisition system. It will also implement new strategies and algorithms for processing and transferring the data to final storage. This PhD thesis presents a study of a new pattern recognition algorithm to be used in the trigger system, the software designed to provide the information necessary to select physical events from background data. The idea is to use the well-known Hough transform as an algorithm for detecting particle trajectories. The effectiveness of the algorithm has already been validated in the past, independently of particle physics applications, to detect generic shapes in images. Here, a software emulation tool is proposed for the hardware implementation of the Hough transform, to reconstruct the tracks in the ATLAS Trigger and Data Acquisition system. Until now, it has never been implemented in electronics in particle physics experiments, and as a hardware implementation it would provide overall latency benefits. A comparison between the simulated data and the physical system was performed on a Xilinx UltraScale+ FPGA device.
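The core idea of the Hough transform is that each hit votes for all parameter pairs of curves passing through it, and true tracks appear as peaks in the accumulator. A minimal, generic straight-line sketch (illustrative only — not the ATLAS FPGA implementation, which works in detector-specific track parameters; grid sizes and the line parameterization are assumptions):

```python
import math

def hough_lines(points, n_theta=180, n_rho=100, rho_max=10.0):
    """Minimal Hough-transform accumulator for straight lines.

    Each (x, y) point votes for all (theta, rho) bins satisfying
    rho = x*cos(theta) + y*sin(theta); collinear points pile up
    votes in the same accumulator cell.
    """
    acc = {}
    for x, y in points:
        for i in range(n_theta):
            theta = math.pi * i / n_theta
            rho = x * math.cos(theta) + y * math.sin(theta)
            j = int((rho + rho_max) / (2 * rho_max) * n_rho)
            if 0 <= j < n_rho:
                acc[(i, j)] = acc.get((i, j), 0) + 1
    # Return the best (theta, rho) cell and its vote count.
    (i, j), votes = max(acc.items(), key=lambda kv: kv[1])
    theta = math.pi * i / n_theta
    rho = (j + 0.5) / n_rho * 2 * rho_max - rho_max
    return theta, rho, votes
```

In hardware, the two nested loops become parallel accumulator updates, which is the source of the latency benefit mentioned above.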
Abstract:
Formal methods and software testing are tools for obtaining and controlling software quality. Used together, they provide mechanisms for software specification, verification, and error detection. Even though formal methods allow software to be mathematically verified, they are not enough to assure that a system is free of faults; software testing techniques are therefore necessary to complement the verification and validation of a system. Model-based testing techniques allow tests to be generated from other software artifacts, such as specifications and abstract models. Using formal specifications as the basis for test creation, we can generate better-quality tests, because these specifications are usually precise and free of ambiguity. Fernanda Souza (2009) proposed a method to define test cases from B Method specifications. This method used information from the machine's invariant and the operation's precondition to define positive and negative test cases for an operation, using techniques based on equivalence class partitioning and boundary value analysis. However, the method proposed in 2009 was not automated and had conceptual deficiencies; for instance, it did not fit into a well-defined coverage criteria classification. We started our work with a case study that applied the method to an example B specification from industry. Based on this case study, we obtained insights to improve the method. In our work we evolved the proposed method, rewriting it and adding characteristics to make it compatible with a test classification used by the community. We also improved the method to support specifications structured in different components, to use information from the operation's behavior in the test case generation process, and to use new coverage criteria. In addition, we implemented a tool to automate the method and applied it to more complex case studies.
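The two underlying techniques named above can be illustrated generically. A minimal sketch for an interval precondition of the form lo <= x <= hi (the helper name and the specific value choices are assumptions, not the B-specific method or tool described in the abstract):

```python
def boundary_and_partition_values(lo, hi):
    """Derive test inputs from an interval precondition lo <= x <= hi.

    Positive cases (valid equivalence class): both boundaries, the
    values just inside them, and one interior representative.
    Negative cases (invalid classes): values just outside each boundary.
    """
    positive = [lo, lo + 1, (lo + hi) // 2, hi - 1, hi]
    negative = [lo - 1, hi + 1]
    return positive, negative
```

Positive values should satisfy the precondition and exercise the operation; negative values should be rejected, checking that the invariant is protected.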
Abstract:
This paper presents a vision that allows the combined use of model-driven engineering, run-time monitoring, and animation for the development and analysis of components in real-time embedded systems. A key building block in the tool environment supporting this vision is a highly customizable code generation process. Customization is performed via a configuration specification that describes the ways in which input is provided to the component, the ways in which run-time execution information can be observed, and how these observations drive animation tools. The environment is envisioned to be suitable for activities ranging from quality assurance to supporting certification, teaching, and outreach, and will be built exclusively with open-source tools to increase impact. A preliminary prototype implementation is described.
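A configuration specification of the kind described might look as follows. This is purely a hypothetical sketch of the three concerns (inputs, observations, animation bindings); every key, component name, and value here is an illustrative assumption, not the paper's actual format:

```python
# Hypothetical configuration for the code-generation process:
# all names below are illustrative, not taken from the paper.
monitor_config = {
    "component": "SpeedController",
    "inputs": {                      # how input is provided to the component
        "setpoint": {"source": "test_script", "type": "float"},
        "sensor":   {"source": "replay_log", "type": "float"},
    },
    "observations": {                # run-time information to expose
        "state":  {"probe": "periodic", "period_ms": 10},
        "output": {"probe": "on_change"},
    },
    "animation": {                   # how observations drive animation tools
        "state":  {"widget": "gauge"},
        "output": {"widget": "timeline"},
    },
}

def observed_signals(cfg):
    """List the signals a generated monitor would expose."""
    return sorted(cfg["observations"])
```

A generator customized by such a specification would emit the probing and input-injection code for the component, leaving the component's own logic untouched.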
Abstract:
Rupture of a light cellophane diaphragm in an expansion tube has been studied by an optical method. The influence of the light diaphragm on test flow generation has long been recognised; however, the diaphragm rupture mechanism is less well known. It has previously been postulated that the diaphragm ruptures around its periphery due to the dynamic pressure loading of the shock wave, with the diaphragm material at some stage being removed from the flow to allow the shock to accelerate to the measured speeds downstream. The images obtained in this series of experiments are the first to show the mechanism of diaphragm rupture and mass removal in an expansion tube. A light diaphragm was impulsively loaded by a shock wave, and a series of images was recorded holographically throughout the rupture process, showing gradual destruction of the diaphragm. Features such as the diaphragm material, the interface between gases, and a reflected shock were clearly visualised. Both qualitative and quantitative aspects of the rupture dynamics were derived from the images and compared with existing one-dimensional theory.
Abstract:
Films of BaFe12O19/P(VDF-TrFE) composites with 5, 10, and 20 wt% barium ferrite content have been fabricated. The BaFe12O19 microparticles have the shape of thin hexagonal platelets, with the easy direction of magnetization remaining along the c axis, perpendicular to the plates. This allows the ferrite particles to be oriented in-plane and out-of-plane within the composite films, as confirmed by the measured hysteresis loops. While the in-plane induced magnetoelectric (ME) effect is practically zero, these composite films show a good out-of-plane magnetoelectric effect, with maximum ME coupling coefficient changes of 3, 17, and 2 mV/(cm·Oe) for the 5, 10, and 20 wt% barium ferrite content films, respectively. We infer that this ME behavior is driven by the magnetization process that arises when the external magnetic field is applied. We have also measured a linear and reversible magnetoelectric effect at low applied bias fields, where the magnetization process is still reversible.
Abstract:
Precision agriculture (PA) allows farmers to identify and address variations within an agricultural field. Management zones (MZs) make PA more feasible and economical. The most important method for defining MZs is the fuzzy C-means algorithm, but selecting the variables to use as the input layer in the fuzzy process is problematic. BAZZI et al. (2013) used Moran's bivariate spatial autocorrelation statistic to identify variables that are spatially correlated with yield, proposing that all redundant variables be eliminated and that the remaining variables be considered appropriate for the MZ generation process. Thus, the objective of this work, a case study, was to test the hypothesis that redundant variables can harm the MZ delineation process. The work was conducted in a 19.6-ha commercial field, and 15 MZ designs were generated by a fuzzy C-means algorithm and divided into two to five classes. Each design used a different combination of variables, including copper, silt, clay, and altitude. Some combinations of these variables produced superior MZs, but none produced statistically better performance than the MZ generated with no redundant variables. Thus, the redundant variables can be disregarded. The design with all variables did not provide greater separation and organization of data among MZ classes and was not recommended.
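For orientation, the fuzzy C-means algorithm mentioned above alternates between updating soft memberships and class centers. A compact one-dimensional sketch (the deterministic initialization, sample data, and parameter values are illustrative assumptions, not the software or field data used in this study):

```python
def fuzzy_c_means(data, c=2, m=2.0, iters=50):
    """Minimal one-dimensional fuzzy C-means sketch.

    `data` is a list of scalar observations (e.g. one normalized field
    variable); returns class centers and the membership matrix u[i][k]
    of point i in class k. Requires c >= 2.
    """
    s = sorted(data)
    # Spread initial centers evenly across the data range.
    centers = [s[(i * (len(s) - 1)) // (c - 1)] for i in range(c)]
    u = [[0.0] * c for _ in data]
    for _ in range(iters):
        # Membership update: closer centers get larger membership.
        for i, x in enumerate(data):
            for k in range(c):
                d_k = abs(x - centers[k]) or 1e-12
                inv = sum((d_k / (abs(x - centers[j]) or 1e-12)) ** (2 / (m - 1))
                          for j in range(c))
                u[i][k] = 1.0 / inv
        # Center update: membership-weighted mean of the data.
        for k in range(c):
            den = sum(u[i][k] ** m for i in range(len(data)))
            centers[k] = sum((u[i][k] ** m) * x
                             for i, x in enumerate(data)) / den
    return centers, u
```

Assigning each location to the class with its highest membership yields the zone map; running the same procedure with different variable combinations is what produces the 15 MZ designs compared in the study.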