4 results for Computer Simulation

at Bucknell University Digital Commons - Pennsylvania - USA


Relevance:

70.00%

Publisher:

Abstract:

Simulation is an important resource for researchers in diverse fields. However, many researchers have found flaws in the methodology of published simulation studies and have described the state of the simulation community as being in a crisis of credibility. This work describes the Simulation Automation Framework for Experiments (SAFE) project, which addresses the issues that undermine credibility by automating the workflow in the execution of simulation studies. Automation reduces the number of opportunities for users to introduce error into the scientific process, thereby improving the credibility of the final results. Automation also eases the job of simulation users, allowing them to focus on the design of models and the analysis of results rather than on the complexities of the workflow.
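The workflow-automation idea can be sketched as a minimal parameterized study driver: the user declares factors and replication counts once, and the framework enumerates and executes every run. Everything below (the function names, the toy model, the choice of Python) is illustrative and is not SAFE's actual API:

```python
# Minimal sketch of an automated simulation workflow in the spirit of SAFE.
# All names here are illustrative; SAFE's real interface may differ entirely.
import itertools
import statistics

def run_model(params, seed):
    """Stand-in for one simulation run; returns a fake scalar metric.
    A real framework would invoke the simulator itself here."""
    return params["rate"] * 2 + seed * 0.001

def run_study(factors, replications):
    """Execute every factor combination with several replications, so the
    user never hand-manages individual runs (a common source of error)."""
    names = sorted(factors)
    results = {}
    for values in itertools.product(*(factors[n] for n in names)):
        params = dict(zip(names, values))
        samples = [run_model(params, seed) for seed in range(replications)]
        results[values] = statistics.mean(samples)
    return results

results = run_study({"rate": [1, 2, 4]}, replications=3)
```

Because the full design is enumerated mechanically, no combination can be silently skipped or run with the wrong seed, which is the kind of workflow error the abstract argues undermines credibility.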

Relevance:

60.00%

Publisher:

Abstract:

The separation of small molecules by capillary electrophoresis is governed by a complex interplay among several physical effects. Until recently, a systematic understanding of how the influence of all of these effects manifests experimentally has remained elusive. The work presented in this thesis uses transient isotachophoretic stacking (tITP) and computer simulation to improve and better understand an in-capillary chemical assay for creatinine. This assay uses electrophoretically mediated micro-analysis (EMMA) to carry out the Jaffé reaction inside a capillary tube. The primary contribution of this work is the elucidation of the role of the length and concentration of the hydroxide plug used to achieve tITP stacking of the product formed by the in-capillary EMMA/Jaffé method. Computer simulation using SIMUL 5.0 predicts that a 3-4 fold gain in sensitivity can be realized by timing the tITP stacking event such that the Jaffé product peak is at its maximum height as it electrophoreses past the detection window. Overall, the length of the hydroxide plug alters the timing of the stacking event, and lower-concentration hydroxide plugs lead to earlier tITP stacking events. Also, the inclusion of intentional tITP stacking in the EMMA/Jaffé method improves the sensitivity of the assay, including at creatinine concentrations within the normal biological range. Ultimately, improvements in assay sensitivity can be rationally designed by using the length and concentration of the hydroxide plug to engineer the timing of the tITP stacking event so that stacking occurs as the Jaffé product passes the detection window.
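The timing argument can be illustrated with a toy calculation: choose the hydroxide plug whose stacking time best matches the product's arrival at the detection window. The linear stacking model and every constant below are invented purely for illustration and do not come from the thesis or from SIMUL 5.0:

```python
# Illustrative timing sketch only: hypothetical numbers and a made-up
# linear model, not results from SIMUL 5.0 or the thesis.
def stacking_time(plug_length_mm, plug_conc_mM, k=0.5):
    """Toy model: longer and more concentrated hydroxide plugs delay the
    tITP stacking event (lower concentration -> earlier stacking)."""
    return k * plug_length_mm * plug_conc_mM  # seconds

def detection_time(length_to_window_cm, mobility_cm2_per_Vs, field_V_per_cm):
    """Time for the Jaffe product to electrophorese to the detection window."""
    velocity = mobility_cm2_per_Vs * field_V_per_cm  # cm/s
    return length_to_window_cm / velocity

# Pick the plug whose stacking event coincides with detection.
t_detect = detection_time(30.0, 3.0e-4, 400.0)  # 30 cm at 0.12 cm/s -> 250 s
candidates = [(2.0, 50.0), (5.0, 100.0), (10.0, 80.0)]  # (length mm, conc mM)
best = min(candidates, key=lambda p: abs(stacking_time(*p) - t_detect))
```

The point of the sketch is only the design logic the abstract describes: plug length and concentration are knobs that shift the stacking time toward the moment the product passes the detector.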

Relevance:

60.00%

Publisher:

Abstract:

In the past few decades, integrated circuits have become a major part of everyday life. Every circuit that is created needs to be tested for faults so that faulty circuits are not sent to end-users. The creation of these tests is time-consuming, costly, and difficult to perform on larger circuits. This research presents a novel method for fault detection and test pattern reduction in integrated circuitry under test. By leveraging the FPGA's reconfigurability and parallel processing capabilities, a speed-up in fault detection can be achieved over previous computer simulation techniques. This work presents the following contributions to the field of stuck-at fault detection:

- A new method for inserting faults into a circuit netlist. Given any circuit netlist, our tool can insert multiplexers at the correct internal nodes to aid fault emulation on reconfigurable hardware.

- A parallel method of fault emulation. The benefit of the FPGA is not only its ability to implement any circuit but also its ability to process data in parallel. This research exploits that parallelism to create a more efficient emulation method that implements numerous copies of the same circuit in the FPGA.

- A new method to organize the most efficient faults. Most methods for determining the minimum number of inputs that cover the most faults require sophisticated software programs that use heuristics. By utilizing hardware, this research is able to process data faster and use a simpler method for minimizing inputs.
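The multiplexer-insertion idea above can be sketched on a toy gate-level netlist: the target node is renamed, and a 2:1 mux is placed in its position so that a select signal either passes the original value or forces the stuck-at value. The dictionary representation and all names below are assumptions for illustration, not the thesis tool's actual netlist format:

```python
# Hedged sketch of stuck-at fault injection via a 2:1 multiplexer.
# Netlist format is a hypothetical dict: signal -> (gate, input, ...).
def insert_fault_mux(netlist, node, sel_name, fault_value):
    """Replace `node` with a mux: sel=0 passes the original signal,
    sel=1 forces the stuck-at value, enabling fault emulation on an FPGA."""
    faulty = dict(netlist)          # leave the original netlist untouched
    original = faulty.pop(node)
    faulty[node + "_orig"] = original
    # mux(sel, a, b): output b when sel is asserted, otherwise a
    faulty[node] = ("MUX", sel_name, node + "_orig", str(fault_value))
    return faulty

netlist = {"n1": ("AND", "a", "b"), "out": ("OR", "n1", "c")}
faulty = insert_fault_mux(netlist, "n1", "fault_en", 0)  # stuck-at-0 on n1
```

Downstream gates (here, `out`) still reference `n1`, so asserting `fault_en` exercises the circuit exactly as if the node were stuck at the chosen value, while deasserting it restores fault-free behavior.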