945 results for Computer programs - Testing


Relevance:

30.00%

Publisher:

Abstract:

Introduction: Computer-Aided Design (CAD) and Computer-Aided Manufacture (CAM) have been developed to fabricate fixed dental restorations more accurately, faster, and more cost-effectively than the conventional method. Two main methods exist in dental CAD/CAM technology: the subtractive and the additive method. While the fitting accuracy of both methods has been explored, no study has yet compared the fabricated restoration (the CAM output) to its CAD in terms of accuracy. The aim of the present study was to compare the output of various dental CAM routes against a single initial CAD and to establish the accuracy of fabrication. The internal fit of the various CAM routes was also investigated. The null hypotheses tested were: 1) no significant differences exist between the CAM output and the CAD, and 2) no significant differences exist between the various CAM routes.

Methods: An aluminium master model of a standard premolar preparation was scanned with a contact dental scanner (Incise, Renishaw, UK). A single CAD was created on the scanned master model (InciseCAD software, V2.5.0.140, UK). Twenty copings were then fabricated by sending the single CAD to multiple CAM routes. The copings were grouped (n=5) as: laser-sintered CoCrMo (LS), 5-axis milled CoCrMo (M-CoCrMo), 3-axis milled zirconia (ZAx3) and 4-axis milled zirconia (ZAx4). All copings were micro-CT scanned (Phoenix X-Ray, Nanotom-S, Germany; voltage: 155 kV, current: 60 µA, 3600 projections) to produce 3-dimensional (3D) models. A novel methodology was created to superimpose the micro-CT scans on the CAD (GOM Inspect software, V7.5SR2, Germany) to indicate inaccuracies in manufacturing. Accuracy in terms of coping volume was explored. The distances from the surfaces of the micro-CT 3D models to the surfaces of the CAD model (CAD Deviation) were investigated after creating surface colour deviation maps. Localised digital sections of the deviations (Occlusal, Axial and Cervical) and selected focussed areas were then quantitatively measured (GOM Inspect software, Germany). A novel methodology was also explored to digitally align (Rhino software, V5, USA) the micro-CT scans with the master model to investigate internal fit. Fifty digital cross-sections of the aligned scans were created, and point-to-point distances were measured at 5 levels on each cross-section: Vertical Marginal Fit (VF), Absolute Marginal Fit (AM), Axio-margin Fit (AMF), Axial Fit (AF) and Occlusal Fit (OF).

Results: The volume measurements were summarised as V(M-CoCrMo) (62.8 mm³) > V(ZAx3) (59.4 mm³) > V(CAD) (57 mm³) > V(ZAx4) (56.1 mm³) > V(LS) (52.5 mm³), and all were significantly different (p < …). CAD Deviation was presented as areas of different colour. No significant differences were observed between any of the coping groups at the internal cervical aspect. Significant differences (p < …) were found as follows: 1) … > M-CoCrMo at the Internal Occlusal, Internal Axial and External Axial aspects; 2) ZAx3 > ZAx4 at the External Occlusal and External Cervical aspects; 3) ZAx3 < ZAx4 at the Internal Occlusal aspect; and 4) M-CoCrMo > ZAx4 at the Internal Occlusal and Internal Axial aspects. The mean values of AMF and AF were significantly (p < …) different, with CAD > M-CoCrMo and CAD > ZAx4. Only the VF of M-CoCrMo was comparable with the CAD internal fit. All VF and AM values were within the clinically acceptable fit (120 µm).

Conclusion: The investigated CAM methods reproduced the CAD accurately at the internal cervical aspect of the copings. However, localised deviations at the axial and occlusal aspects of the copings may suggest the need for modifications in these areas prior to fitting and veneering with porcelain. The CAM groups evaluated also showed different levels of internal fit, thus rejecting the null hypotheses. The novel non-destructive methodologies for CAD/CAM accuracy and internal fit testing presented in this thesis may be a useful evaluation tool for similar applications.
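For readers interested in the deviation analysis itself, the following is a minimal sketch of the underlying computation: once a scanned surface has been registered to the CAD reference, point-to-surface deviations can be approximated as nearest-neighbour distances between sampled point clouds. All names and data here are illustrative assumptions; the thesis used GOM Inspect rather than custom code.

```python
# Minimal sketch: surface deviation between an aligned micro-CT scan and its
# CAD reference, approximated as nearest-neighbour point-to-point distances.
# Assumes both surfaces are available as pre-registered Nx3 point clouds.
import numpy as np
from scipy.spatial import cKDTree

def surface_deviation(cad_points: np.ndarray, scan_points: np.ndarray) -> np.ndarray:
    """Distance from each scanned point to the closest CAD surface point."""
    tree = cKDTree(cad_points)            # spatial index over the CAD surface
    distances, _ = tree.query(scan_points)
    return distances

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cad = rng.random((10_000, 3))                   # stand-in CAD surface samples
    scan = cad + rng.normal(0, 0.01, cad.shape)     # stand-in micro-CT samples
    dev = surface_deviation(cad, scan)
    print(f"mean deviation: {dev.mean():.4f}, 95th percentile: {np.percentile(dev, 95):.4f}")
```

A colour deviation map of the kind described above is essentially this distance field painted onto the scanned surface.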

Relevance:

30.00%

Publisher:

Abstract:

Much of the bridge stock on major transport links in North America and Europe was constructed in the 1950s and 1960s and has since deteriorated or is carrying loads far in excess of the original design loads. Structural health monitoring (SHM) systems can provide valuable information on bridge capacity, but the application of such systems is currently limited by access and system cost. This paper investigates the development of a low-cost portable SHM system using commercially available cameras and computer vision techniques. A series of laboratory tests was carried out to assess the accuracy of contactless displacement measurement. The results from each test were validated against established measurement methods, such as linear variable differential transformers (LVDTs). A video image of each test was processed using two different digital image correlation programs. The displacements obtained from the digital image correlation methods compared well with the validation measurements, agreeing within 4% of the LVDT measurements in most cases and confirming the suitability of camera-based SHM systems.
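As a rough illustration of the computer vision approach (not the specific programs used in the paper), the displacement of a target on a structure between video frames can be estimated by locating a reference template in each frame via normalised cross-correlation, for example with OpenCV. The synthetic frames and the mm-per-pixel scale below are placeholder assumptions.

```python
# Sketch of template-matching displacement measurement in the spirit of
# digital image correlation: find where a reference patch moved to.
import cv2
import numpy as np

def track_displacement(ref_frame, frame, roi):
    """Return (dx, dy) pixel displacement of the roi=(x, y, w, h) template."""
    x, y, w, h = roi
    template = ref_frame[y:y + h, x:x + w]
    result = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
    _, _, _, (best_x, best_y) = cv2.minMaxLoc(result)   # peak correlation
    return best_x - x, best_y - y

# Synthetic stand-in for two video frames: a speckle pattern shifted 3 px down.
rng = np.random.default_rng(0)
ref = (rng.random((480, 640)) * 255).astype(np.uint8)
cur = np.roll(ref, shift=3, axis=0)

dx, dy = track_displacement(ref, cur, roi=(300, 200, 64, 64))
mm_per_px = 0.1   # placeholder; in practice derived from a calibration target
print(f"displacement: dx={dx * mm_per_px:.2f} mm, dy={dy * mm_per_px:.2f} mm")
```

Sub-pixel refinement (e.g. interpolating around the correlation peak) is what real DIC packages add on top of this basic scheme.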

Relevance:

30.00%

Publisher:

Abstract:

Ageing and deterioration of infrastructure is a challenge facing transport authorities. In particular, there is a need for increased bridge monitoring in order to provide adequate maintenance and to guarantee acceptable levels of transport safety. The Intelligent Infrastructure group at Queen's University Belfast (QUB) is working on a number of aspects of infrastructure monitoring, and this paper presents summarised results from three distinct monitoring projects carried out by the group. Firstly, the findings from a project on next-generation Bridge Weigh-in-Motion (B-WIM) are reported; this includes full-scale field testing using fibre-optic strain sensors. Secondly, results from early-phase testing of a computer vision system for bridge deflection monitoring are reported. This research seeks to exploit recent advances in image processing technology with a view to developing contactless bridge monitoring approaches. Considering the logistical difficulty of installing sensors on a 'live' bridge, contactless monitoring has some inherent advantages over conventional contact-based sensing systems. Finally, the last section of the paper presents some recent findings on drive-by bridge monitoring. In practice, a drive-by monitoring system will likely require GPS to allow the response of a given bridge to be identified; this study examines the feasibility of using low-cost GPS sensors for this purpose via field trials. The three topics outlined above cover a spectrum of SHM approaches, namely wired monitoring, contactless monitoring and drive-by monitoring.

Relevance:

30.00%

Publisher:

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A.; SOUZA NETO, Plácido A. JCML: A specification language for the runtime verification of Java Card programs. Science of Computer Programming. [S.l.]: [s.n.], 2010.

Relevance:

30.00%

Publisher:

Abstract:

COSTA, Umberto Souza da; MOREIRA, Anamaria Martins; MUSICANTE, Martin A. Specification and Runtime Verification of Java Card Programs. Electronic Notes in Theoretical Computer Science. [S.l.]: [s.n.], 2009.

Relevance:

30.00%

Publisher:

Abstract:

Many-core systems are emerging from the need for more computational power and greater power efficiency. However, many issues still surround many-core systems: they need specialized software before they can be fully utilized, and the hardware itself may differ from conventional computing systems. To gain efficiency from a many-core system, programs need to be parallelized. In many-core systems the cores are small and less powerful than the cores used in traditional computing, so running a conventional program is not an efficient option. Also, in Network-on-Chip-based processors the network may become congested and the cores may work at different speeds. In this thesis, a dynamic load balancing method is proposed and tested on the Intel 48-core Single-Chip Cloud Computer by parallelizing a fault simulator. The maximum speedup is difficult to obtain due to severe bottlenecks in the system. In order to exploit all the available parallelism of the Single-Chip Cloud Computer, a runtime approach capable of dynamically balancing the load during the fault simulation process is used. The proposed dynamic fault simulation approach on the Single-Chip Cloud Computer shows up to a 45× speedup compared to a serial fault simulation approach.

Many-core systems can also draw enormous amounts of power, and if this power is not controlled properly, the system might get damaged. One way to manage power is to set a power budget for the system, but if this power is drawn by just a few of the many cores, those cores become extremely hot and may be damaged. Due to the increase in power density, multiple thermal sensors are deployed across the chip area to provide real-time temperature feedback for thermal management techniques. Thermal sensor accuracy is extremely prone to intra-die process variation and aging phenomena, which lead to a situation where thermal sensor values drift from the nominal values. This necessitates efficient calibration techniques to be applied before the sensor values are used. In addition, cores in modern many-core systems support dynamic voltage and frequency scaling, and thermal sensors located on cores are sensitive to the core's current voltage level, meaning that dedicated calibration is needed for each voltage level. In this thesis, a general-purpose software-based auto-calibration approach is also proposed to calibrate thermal sensors across a range of voltage levels.
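As an illustration of the dynamic load balancing idea (a sketch only, not the SCC implementation, which communicates over the chip's own message-passing network), a shared work queue lets idle workers pull the next fault as soon as they finish, so faster cores automatically process more of the workload. The fault "simulation" below is a stand-in.

```python
# Sketch of queue-based dynamic load balancing for parallel fault simulation.
import multiprocessing as mp

def simulate_fault(fault_id: int) -> bool:
    """Placeholder for injecting one fault and running the test set."""
    return fault_id % 3 == 0   # pretend every third fault is detected

def worker(tasks: mp.Queue, results: mp.Queue) -> None:
    while True:
        fault_id = tasks.get()
        if fault_id is None:           # sentinel: no more work
            break
        results.put((fault_id, simulate_fault(fault_id)))

if __name__ == "__main__":
    tasks, results = mp.Queue(), mp.Queue()
    n_workers, n_faults = 4, 1000
    for f in range(n_faults):
        tasks.put(f)
    for _ in range(n_workers):
        tasks.put(None)                # one sentinel per worker
    procs = [mp.Process(target=worker, args=(tasks, results)) for _ in range(n_workers)]
    for p in procs:
        p.start()
    detected = sum(results.get()[1] for _ in range(n_faults))
    for p in procs:
        p.join()
    print(f"detected {detected}/{n_faults} faults")
```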

Relevance:

30.00%

Publisher:

Abstract:

Large component-based systems are often built from many of the same components. As individual component-based software systems are developed, tested and maintained, these shared components are repeatedly manipulated. As a result there are often significant overlaps and synergies across and among the different test efforts of different component-based systems. However, in practice, testers of different systems rarely collaborate, taking a test-all-by-yourself approach. As a result, redundant effort is spent testing common components, and important information that could be used to improve testing quality is lost. The goal of this research is to demonstrate that, if done properly, testers of shared software components can save effort by avoiding redundant work, and can improve the test effectiveness for each component as well as for each component-based software system by using information obtained when testing across multiple components. To achieve this goal I have developed collaborative testing techniques and tools for developers and testers of component-based systems with shared components, applied the techniques to subject systems, and evaluated the cost and effectiveness of applying the techniques. The dissertation research is organized in three parts. First, I investigated current testing practices for component-based software systems to find the testing overlap and synergy we conjectured exists. Second, I designed and implemented infrastructure and related tools to facilitate communication and data sharing between testers. Third, I designed two testing processes to implement different collaborative testing algorithms and applied them to large actively developed software systems. This dissertation has shown the benefits of collaborative testing across component developers who share their components. With collaborative testing, researchers can design algorithms and tools to support collaboration processes, achieve better efficiency in testing configurations, and discover inter-component compatibility faults within a minimal time window after they are introduced.
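One simple way to picture the data-sharing infrastructure (purely illustrative; the dissertation's actual tools and processes are richer) is a shared store of test outcomes keyed by component, version and test, consulted before any tester re-runs a test that another team has already executed:

```python
# Sketch of a shared test-result store to avoid redundant testing of
# components that several systems have in common. Storage and naming
# are illustrative assumptions.
import json
from pathlib import Path

STORE = Path("shared_test_results.json")   # placeholder shared location

def load_store() -> dict:
    return json.loads(STORE.read_text()) if STORE.exists() else {}

def run_if_needed(component: str, version: str, test: str, run) -> str:
    store = load_store()
    key = f"{component}@{version}::{test}"
    if key in store:                 # another team already ran this test
        return store[key]
    store[key] = run()               # run once, publish for everyone
    STORE.write_text(json.dumps(store, indent=2))
    return store[key]

result = run_if_needed("libparser", "2.3.1", "test_unicode", lambda: "pass")
print(result)
```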

Relevance:

30.00%

Publisher:

Abstract:

With the increasing complexity of today's software, the software development process is becoming highly time- and resource-consuming. The increasing number of software configurations, input parameters, usage scenarios, supporting platforms, external dependencies, and versions plays an important role in expanding the costs of maintaining and repairing unforeseeable software faults. To repair software faults, developers spend considerable time identifying the scenarios leading to those faults and root-causing the problems. While software testing and verification have been substantially automated, software debugging remains largely manual. The goal of this research is to improve the software development process in general, and the software debugging process in particular, by devising techniques and methods for automated software debugging that leverage the advances in automatic test case generation and replay. In this research, novel algorithms are devised to discover faulty execution paths in programs by utilizing already existing software test cases, which can be either automatically or manually generated. The execution traces, or alternatively the sequence covers, of the failing test cases are extracted. Afterwards, commonalities between these test case sequence covers are extracted, processed, analyzed, and then presented to the developers in the form of subsequences that may be causing the fault. The hypothesis is that code sequences shared between a number of test cases failing for the same reason resemble the faulty execution path, and hence the search space for the faulty execution path can be narrowed down by using a large number of test cases. To achieve this goal, an efficient algorithm is implemented for finding common subsequences among a set of code sequence covers. Optimization techniques are devised to generate shorter and more logical sequence covers, and to select subsequences with a high likelihood of containing the root cause among the set of all possible common subsequences. A hybrid static/dynamic analysis approach is designed to trace the common subsequences back from the failure to the root cause. A debugging tool is created to enable developers to use the approach and to integrate it with an existing Integrated Development Environment. The tool is also integrated with the environment's program editors so that developers can benefit from both the tool's suggestions and their source code counterparts. Finally, a comparison between the developed approach and state-of-the-art techniques shows that developers need to inspect only a small number of lines in order to find the root cause of a fault. Furthermore, experimental evaluation shows that the algorithm optimizations lead to better results in terms of both the algorithm running time and the output subsequence length.
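The core hypothesis can be illustrated with a toy version of the subsequence analysis (an assumption-laden sketch, not the dissertation's algorithms): treat each trace as a list of executed statement identifiers and keep the n-grams that every failing trace shares but no passing trace contains.

```python
# Sketch: code subsequences common to all failing traces but absent from
# passing ones are candidate fault regions.
def ngrams(trace, n):
    return {tuple(trace[i:i + n]) for i in range(len(trace) - n + 1)}

def suspicious_subsequences(failing, passing, n=3):
    """N-grams present in every failing trace and in no passing trace."""
    common = set.intersection(*(ngrams(t, n) for t in failing))
    seen_passing = set.union(*(ngrams(t, n) for t in passing)) if passing else set()
    return common - seen_passing

failing = [["a", "b", "x", "y", "z", "c"], ["d", "x", "y", "z", "e"]]
passing = [["a", "b", "c"], ["d", "e"]]
print(suspicious_subsequences(failing, passing))   # {('x', 'y', 'z')}
```

Real traces are orders of magnitude longer, which is why the dissertation invests in efficient common-subsequence algorithms and in shortening the sequence covers before analysis.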

Relevance:

30.00%

Publisher:

Abstract:

The main objective of this work was to develop an application capable of determining the diffusion times and diffusion coefficients of optical clearing agents and water inside a known type of muscle. Different types of chemical agents, such as medications or metabolic products, can also be used with the implemented method. Since the diffusion times can be calculated, it is possible to describe the dehydration mechanism that occurs in the muscle. Calculating the diffusion time of an optical clearing agent characterizes the refractive index matching mechanism of optical clearing. By using the diffusion times and diffusion coefficients of both water and clearing agents, not only are the optical clearing mechanisms characterized, but information about the duration and magnitude of the optical clearing effect is also obtained. Such information is crucial for planning a clinical intervention performed in combination with optical clearing. The experimental method and the equations implemented in the developed application are described throughout this document, demonstrating its effectiveness. The application was developed in MATLAB, and the method was customized to better fit the application's needs. This significantly improved processing efficiency and reduced the time to obtain the results; multiple validations prevent common errors, and extra functionalities were added, such as saving application progress and exporting information in different formats. Tests were made using glucose measurements in muscle. For testing purposes, some of the data was also intentionally changed in order to obtain different simulations and results from the application. The entire project was validated by comparing the calculated results with those found in the literature, which are also described in this document.
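As a sketch of the kind of computation such an application performs (the thesis implementation is in MATLAB; the model form and synthetic data here are illustrative assumptions), a characteristic diffusion time can be estimated by fitting first-order kinetics to a time-resolved measurement:

```python
# Sketch: estimate a diffusion (characteristic) time tau by fitting a
# saturating first-order response to time-resolved measurements.
import numpy as np
from scipy.optimize import curve_fit

def model(t, baseline, amplitude, tau):
    """Response rising as the agent diffuses into the tissue."""
    return baseline + amplitude * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 600, 121)                     # seconds
true = model(t, 0.1, 0.4, 120.0)                 # synthetic "measurement"
noisy = true + np.random.default_rng(1).normal(0, 0.01, t.size)

(baseline, amplitude, tau), _ = curve_fit(model, t, noisy, p0=(0.0, 1.0, 60.0))
# A diffusion coefficient can then be derived from tau together with the
# sample geometry (e.g. slab thickness), per the model adopted in the work.
print(f"estimated diffusion time tau ≈ {tau:.1f} s")
```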

Relevance:

30.00%

Publisher:

Abstract:

The principalship has changed significantly over the past 20 years. Today's principals must be effective instructional leaders, managers of large facilities, and experts at analyzing data in order to meet the accountability demands of high-stakes testing along with state and federal mandates. The primary purpose of this quantitative study was to examine how 43 first- and second-year sitting school principals perceived their mentoring experiences and the degree to which a principal mentoring program, offered by their large urban school district, was effective in building their leadership capacity. A second purpose of this inquiry was to understand these principals' perceptions of the most beneficial aspects of the mentoring program. The study used quantitative data gathered via an online questionnaire distributed during Fall 2015. The results indicated that respondents perceived the components of the large urban school district's mentoring program to be generally effective in training principal mentees to become highly effective school leaders. This study enriches the literature on mentoring by providing the voices of first- and second-year school leaders, adding depth to the characteristics of successful mentoring programs.

Relevance:

30.00%

Publisher:

Abstract:

This research investigated faking across test administration modes in an employment testing scenario. For the first time, phone administration was included. Participants (N = 91) were randomly allocated to a testing mode (telephone, Internet, or pen-and-paper). Participants completed a personality measure under standard instructions and then under instructions to fake as an ideal police applicant. No significant difference in any faked personality domain was found as a function of administration mode, and effect sizes indicated that the influence of administration mode was small. Limitations and future directions are considered. Overall, the results indicate that if an individual intends to fake on a self-report test in a vocational assessment scenario, the administration mode in which the test is delivered may be unimportant.

Relevance:

30.00%

Publisher:

Abstract:

Computer simulation programs are essential tools for scientists and engineers to understand a particular system of interest. As expected, the complexity of the software increases with the depth of the model used. In addition to the exigent demands of software engineering, verification of simulation programs is especially challenging because the models represented are complex and ridden with unknowns that will be discovered by developers in an iterative process. To manage such complexity, advanced verification techniques for continually matching the intended model to the implemented model are necessary. The main goal of this research work is therefore to design a useful verification and validation framework that is able to identify model representation errors and is applicable to generic simulators. The framework that was developed and implemented consists of two parts. The first part is the First-Order Logic Constraint Specification Language (FOLCSL), which enables users to specify the invariants of a model under consideration. From the first-order logic specification, the FOLCSL translator automatically synthesizes a verification program that reads the event trace generated by a simulator and signals whether all invariants are respected. The second part mines the temporal flow of events using a newly developed representation called the State Flow Temporal Analysis Graph (SFTAG). While the first part seeks assurance of implementation correctness by checking that the model invariants hold, the second part derives an extended model of the implementation and thereby enables a deeper understanding of what was implemented. The main application studied in this work is the validation of the timing behavior of micro-architecture simulators. The study includes SFTAGs generated for a wide set of benchmark programs and their analysis using several artificial intelligence algorithms. This work improves computer architecture research and verification processes, as shown by the case studies and experiments that have been conducted.
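A minimal sketch of the runtime-verification half of such a framework (illustrative only; FOLCSL specifications and the synthesized checkers are more expressive) is a monitor that replays a simulator's event trace and checks each invariant against every event:

```python
# Sketch: invariants as predicates over an event trace, checked by replay.
# The event format and the example invariant are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Event:
    time: int
    name: str
    payload: dict

def check_invariant(trace, invariant, description):
    """Replay the trace; report the first event violating the invariant."""
    for event in trace:
        if not invariant(event):
            print(f"VIOLATION at t={event.time}: {description} ({event})")
            return False
    print(f"OK: {description}")
    return True

trace = [Event(0, "fetch", {"pc": 0}), Event(1, "fetch", {"pc": 4}),
         Event(2, "fetch", {"pc": 6})]

# Example invariant: instruction fetches are word-aligned.
check_invariant(trace, lambda e: e.name != "fetch" or e.payload["pc"] % 4 == 0,
                "fetch addresses are 4-byte aligned")
```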